
Game cycle, FPS and optimisation


Julz57


I think I posted this in the wrong section before. Hope I am correct now. And yes, I'm a noob.

=======

hi,

Been reading about basic game function, engine, cycles, FPS and refresh rate and now wondering how Phaser manages all of these, if at all.

Games are usually designed so that the game cycle and FPS do not interfere with each other too much, so the code works the same (similar playability, but a varying graphical experience) on systems of differing capability. But I have not read anything about this kind of optimisation for Phaser games. Is this because the frame rate is controlled by the refresh rate of the player's web browser and is therefore out of the designer's control?

I have been reading Phaser game code on different sites and nowhere have I seen this optimisation implemented. While surfing I actually came across a site with a series of playable examples and the associated code (minor snippets really, with minimal graphics) which seemed to stall and were jerky during play. This was in stark contrast to all my other experiences with Phaser code examples. I was using a Samsung Tab A at the time. So the question above came to mind much later, as the dust was settling after a session of hunting out information.

So how does Phaser deal with the above and optimisation for a multitude of systems of varying capacity?

Does it 'sleep' in between cycles when CPU calculations are not required to conserve battery power?

Thanks

 


  • Phaser makes exactly one render for every animation frame it receives from the browser. Usually that's 60 renders/s. You can skip renders by turning on lockRender.
  • When forceSingleUpdate is on, Phaser makes exactly one logic update per animation frame. 
  • When forceSingleUpdate is off, Phaser tries to make the number of logic updates required to match desiredFps.
  • You can monitor suggestedFps and adjust desiredFps if you like (see the sketch below).
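
Roughly, these knobs live on the Game and Time objects. A minimal sketch, assuming Phaser 2 / Phaser CE (the values are only illustrative):

// Minimal Phaser CE sketch; values are illustrative only.
var game = new Phaser.Game(800, 600, Phaser.AUTO, '', {

  create: function () {
    game.time.desiredFps = 60;       // target logic update rate
    game.forceSingleUpdate = false;  // allow catch-up updates toward desiredFps
    // game.lockRender = true;       // uncomment to skip rendering entirely
  },

  update: function () {
    // If the browser can't sustain desiredFps, Phaser computes a lower
    // suggestedFps that we could adopt instead.
    if (game.time.suggestedFps !== null &&
        game.time.suggestedFps < game.time.desiredFps) {
      game.time.desiredFps = game.time.suggestedFps;
    }
  }
});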

On 11/09/2017 at 2:08 PM, Julz57 said:

Does it 'sleep' in between cycles when CPU calculations are not required to conserve battery power?

This is a really interesting question, but it's not really to do with Phaser; it's to do with how the browser works, which is potentially different for each browser. Answering it properly requires pretty intimate knowledge of how the browser works (which I don't have, and which isn't common knowledge, although maybe it should be).

Node, as an example of a JS engine implementation (usually the same engine as Chrome's), creates an event loop that keeps spinning and handles events as they come in; I've read about this (I think) at https://nikhilm.github.io/uvbook/introduction.html (somewhere in there, anyway). Specifically, for an async action like initiating a file read, the initiation is queued up and the response (whenever it arrives, some time in the future) is linked back to that action when it resolves, so the event loop is free to handle other work that occurs between initiation and resolution. I've kind of always assumed that the browser works in a similar way to try and conserve system resources. As a JS developer working in this 'browser' vacuum, should you be concerned with this? Well, perhaps, but there's nothing you can do about it if you don't like the answers you find.
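
As a rough illustration of that pattern in plain Node (a sketch; the file name is made up):

// The read is initiated, the event loop keeps spinning, and the callback
// runs whenever the result is delivered.
var fs = require('fs');

fs.readFile('save.json', 'utf8', function (err, data) {
  console.log('read finished:', err ? err.message : data.length + ' chars');
});

console.log('read started; the loop is free to handle other events');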

It's certainly true that a well-written browser application will consume more resources (such as CPU cycles) than a well-written 'native' application, but it's not really a fair comparison. As a JS developer you're working at a higher level of abstraction and you don't (and can't) get involved with all the low-level interactions.

When working with your average JS application (particularly one targeting browsers, as most do) you're severely hamstrung with regards to implementing some of the more complex application loops common in gaming. You're almost completely restricted to requestAnimationFrame and you can't fight it.
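
For what it's worth, the basic shape of the loop you end up with is roughly this (a sketch; update and render are hypothetical hooks):

// Minimal requestAnimationFrame loop: the browser decides when tick runs,
// usually once per display refresh.
var last = performance.now();

function tick(now) {
  var dt = (now - last) / 1000; // seconds since the previous frame
  last = now;

  // update(dt);  // hypothetical game logic hook
  // render();    // hypothetical draw hook

  requestAnimationFrame(tick);
}

requestAnimationFrame(tick);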

JS is single-threaded (threads are hard; there are lots of advantages to this), but something like Web Workers can run work in other threads and free up your UI thread (which is your main thread in JS), much as other platforms do. So you could try to offload your AI logic (for example) to a web worker so your main thread can push rendering at the refresh rate (i.e. 60fps). I don't know of any examples of this (some might exist, as some browser games are pretty involved nowadays) and it's certainly a very difficult thing to do.
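
In principle it looks something like this (a sketch; 'ai-worker.js' and the message shapes are made up):

// main.js (sketch): keep rendering on the main thread, hand heavy AI work
// to a worker and apply its answer when it comes back.
var aiWorker = new Worker('ai-worker.js');

aiWorker.onmessage = function (event) {
  console.log('AI decided:', event.data);
};

aiWorker.postMessage({ player: { x: 0, y: 0 }, enemies: [{ x: 10, y: 20 }] });

// ai-worker.js (sketch): runs off the main thread.
// self.onmessage = function (event) {
//   var state = event.data;
//   // ...expensive pathfinding here...
//   self.postMessage({ move: 'towards-player' });
// };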


Thanks @samme and @mattstyles

Yes, that makes sense now. I cannot see the point of increasing the refresh rate, which is currently 60fps as per @samme. As a retired optometrist: the eye cannot resolve anything more than about 25-30 fps in the central field, as the information blurs into a smooth percept. Having 60 fps is therefore adequate for VR applications using the internet; going any faster will not improve the smoothness. This may still become an issue if we go to super-wide-field screens with ultra-high resolution, where even 60fps (30fps in VR mode) may show up as jagged movement. This is more likely to occur in the peripheral field, which is where the neural pathways and receptors are geared primarily towards identifying change or movement; the critical fusion frequency there is much higher than in the central visual field, where our screens are currently located. It may happen in your lifetime, but I doubt it in mine. lol, from an old bastard who has the pleasure of being retired and currently slack as.


On 9/21/2017 at 4:24 PM, Julz57 said:

Having 60 fps is therefore adequate for VR applications using the internet; going any faster will not improve the smoothness.

What? VR applications need 120 fps for a decent VR experience. You can clearly see above 60 FPS; there are monitors with a 120 Hz refresh rate (refreshing 120 times per second), which makes 120 fps look incredibly smooth compared to 60 Hz monitors (which most screens are). 60 FPS in VR would cause motion sickness.


@Legomite

"What? VR applications needs 120 fps to be a decent VR experience. You can clearly see above 60 FPS, as there are monitors using a 120hz refresh (refreshes 120 times per second) which makes 120 fps incredibly smooth compared to 60 hz monitors (which are most screens). 60 FPS in VR would cause motion sickness "

Hi Legomite,

I think there is a difference in definitions here. Critical fusion frequency is the rate of presentation at which the eye can no longer discern that an object is turning off and on, off and on... So 60fps is well above this value centrally. 3D stereo TVs use shutter glasses that switch from the left eye to the right eye and back, which the brain fuses into one image, but the frequency of presentation to each eye is half the overall fps rate (30fps). So if it is controlled by the web browser (60fps), then each eye gets a 30fps view, which it perceives as stable.

Ever notice that fluorescent lights flicker in the peripheral field but look stable when viewed directly?

I did mention that the peripheral visual field is an altogether different beast when it comes to detection of movement and change. It is extremely sensitive to these, so a higher critical fusion frequency is required to stop it from looking strobe-like. On top of that, the possible range of angular movement is much greater, making peripheral movement look blocky or segmented without a very high fps: up to 220 degrees from one edge of the visual field to the other, whereas the central field is only about 30 degrees wide.

So the fps requirements for VR with full-field stimulation are much higher, as you correctly note. Central-field-only VR does not need as much, as we know from watching 3D TV with shutter glasses.


This is super interesting (for me at least). I've often heard that 120fps monitors are largely a way to get peeps to spend way more money than they need to; now I've got much more information about why this is indeed correct, and about the differences between VR refresh requirements and a regular monitor further away from your face.

It also makes you appreciate how darned incredible vision is, and we're only talking about the mechanics here, not perception. Really interesting.


@mattstyles Yes, it is even a bit more complicated than the information so far. Our peripheral field is absolutely critical for spatial localisation, orientation and mobility, so poor performance in this area will cause motion sickness, nausea and poor spatial awareness, leading to basically a crap experience. This may be magnified by ocular motor control disorders in people with latent turns (measurable clinically but not apparent to the patient apart from fatigue and other functional issues) or turned or lazy eyes. The primary fixation point for straight-ahead viewing also needs to be calibrated for each eye for optimum results. And then, on top of this, the contour of the display's interior surface must match a viewer's normal perception when the eye is rotated off axis (i.e. left, right, up, down), or further disruption of spatial perception is possible.

But having said all of the above, the brain is highly plastic when it comes to some aspects of visual processing and the integration of information from the two eyes. Think of a great accountant managing your books and doing your tax! So when first putting a VR unit on it feels strange, but we typically adapt rather quickly to the new visual inputs. Likewise, there is a bit of re-adaptation when the unit is removed. Enjoy!

 


As far as what Phaser does: it will try to use rAF (requestAnimationFrame), which fires as fast as the hardware allows, basically syncing the update loop with the display refresh rate (assuming atomic updates). This is usually 60 Hz, but it can really be anything, even 160 Hz or more in the future. We should also keep in mind that FreeSync, G-Sync and HDMI 2.1's VRR exist and can allow for intermediate values (e.g. a relatively stable 75 Hz or 90 Hz, or anything really).
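
If you're curious what rate rAF actually runs at on a given device, you can estimate it by averaging frame deltas (a rough sketch, similar in spirit to Phaser's suggestedFps):

// Average ~120 frame deltas to estimate the requestAnimationFrame rate.
var deltas = [];
var prev = null;

function sample(now) {
  if (prev !== null) deltas.push(now - prev);
  prev = now;

  if (deltas.length < 120) {
    requestAnimationFrame(sample);
  } else {
    var avg = deltas.reduce(function (a, b) { return a + b; }, 0) / deltas.length;
    console.log('~' + Math.round(1000 / avg) + ' Hz');
  }
}

requestAnimationFrame(sample);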

As for going over 60 FPS being pointless, I disagree. The difference in perceived smoothness between 60 and 120 FPS is extremely visible to the majority of the population. There's plenty about this on the internet.

About VR: modern VR headsets use 90 Hz for both eyes. Older stereoscopic displays needed 120 Hz to achieve 60 Hz per eye. I'm not aware of any TVs actually using 30 Hz per eye; at the very least, NVIDIA's 3D Vision always required a 120 Hz display.

WebVR will once again use the device's rAF and therefore run as fast as the hardware allows (typically 60 Hz on mobiles and 90 Hz on desktop headsets, for both eyes).


@Antriel What you say may be currently true, as I do not know enough about current technologies. But in days gone by, stereoscopic presentations were done using the alternate-eye presentation method on the Apple IIe and Commodore Amiga, when frame rates were very slow! You are showing your age, but even worse, I am showing mine. lol. PAL CRT TVs were used as monitors, which is a rather sad memory, but we were excited at the time even though the images and animations were blocky and rough (the opposite of smooth?). The images were above CFF, albeit a bit liable to break down at times. The coarseness was viewed as a positive in therapeutic applications, as it reduced the likelihood of the lazy eye suppressing and shutting down.


@Julz57 Yeah, certainly as long as the CFF you mentioned is met, it will work (as in, the stereoscopic 3D effect will be achieved). All I wanted to convey is that going above 60 FPS is certainly worthwhile and that modern VR certainly doesn't work at 30 FPS. There's a difference (which you did explain, so I'm sure you understand it better than me; I'm just bluntly and succinctly repeating it for the sake of people who don't read carefully) between 30 FPS low-resolution stereoscopic 3D and a VR headset that adjusts the rendered view based on head movement.

To reiterate: going above 60 FPS in general gaming is good, stereoscopic 3D needs 60 fps per eye to look good, and VR needs 90 fps per eye to avoid causing too much motion sickness.


