Support for fixed update rates for physics games

  • As devices with higher refresh rates (120 Hz, 144 Hz, etc.) are quite common, it is necessary to use framerate-independent physics most of the time.

    The problem here is (obviously) that the physics are not deterministic anymore. While you can design around this problem, it gets increasingly difficult as refresh rates diverge. Occasional jittering due to a changing dt can be worked around, but it is much harder if the average dt is much lower or higher than intended.

    The differences in physics behavior can be quite big in some instances, with (unsurprisingly) a better and smoother experience at 120 Hz compared to 60 Hz.

    However, the interesting thing is that most of the time the display is the bottleneck - the devices could easily update at 120 ticks per second and gain the advantages of lower latency and smoother physics.

    As we can already test games with unlimited ticks (while still rendering at vsync speed), would it be possible to set the update rate independently from the renderer to a fixed value (e.g., 120 ticks per second)? A rough sketch of what I mean is below.
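
    To make the idea concrete, here is a minimal sketch of the kind of decoupling I mean (the standard accumulator pattern). stepPhysics and render are just placeholders for whatever the engine does internally, not Construct APIs, and 120 is an arbitrary example rate:

    ```typescript
    // Sketch of a fixed-rate physics tick decoupled from rendering.
    // Placeholder names only; this is not Construct's internal loop.
    const TICK_RATE = 120;              // desired physics updates per second
    const FIXED_DT = 1 / TICK_RATE;     // constant timestep per physics tick
    const MAX_ACCUMULATED = 0.25;       // clamp to avoid a spiral of death

    let accumulator = 0;
    let lastTime = performance.now() / 1000;

    function stepPhysics(dt: number): void {
      // advance the physics world by a constant dt (deterministic step)
    }

    function render(): void {
      // draw the current state; runs once per display refresh
    }

    function frame(): void {
      const now = performance.now() / 1000;
      accumulator = Math.min(accumulator + (now - lastTime), MAX_ACCUMULATED);
      lastTime = now;

      // run as many fixed-size ticks as fit into the elapsed time,
      // independent of how fast the display refreshes
      while (accumulator >= FIXED_DT) {
        stepPhysics(FIXED_DT);
        accumulator -= FIXED_DT;
      }

      render();
      requestAnimationFrame(frame);
    }

    requestAnimationFrame(frame);
    ```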

  • You can already set a fixed stepping mode, but that just fixes the physics delta-time value at 1/60 (~16ms, i.e. 60 FPS), meaning the simulation speed varies depending on the display rate.

    Small fixed timesteps are probably a better solution, but there are still some potentially awkward pitfalls - for example, if you fix the physics simulation rate to 30 Hz, it updates at a lower rate than the display, which will probably look weird; if you set it to 60 Hz, it probably won't match a 60 Hz display refresh exactly since they rely on different clocks, which will probably look janky; and if you set it to a high rate like 120 Hz, you start burning a lot of CPU time and it's possible the system then can't keep up, meaning you end up going into slow-motion mode anyway.

    How bad is the non-determinism anyway? Does increasing the simulation accuracy via the velocity/position iterations improve the situation? Stepping with the framerate seems like the best balance of managing the simulation speed, providing smooth gameplay, and using CPU time efficiently, but it does mean dealing with the non-determinism.
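
    To make the trade-offs concrete, here is roughly how the options compare with a Box2D-style Step(dt, velocityIterations, positionIterations) call. The PhysicsWorld interface below is a stand-in for illustration, not Construct's scripting API:

    ```typescript
    // Illustrative only: a Box2D-style world interface, not Construct's API.
    interface PhysicsWorld {
      Step(dt: number, velocityIterations: number, positionIterations: number): void;
    }

    // Framerate-dependent stepping: dt follows the display, so results differ
    // between a 60 Hz and a 144 Hz device (the non-determinism in question).
    function stepWithDisplay(world: PhysicsWorld, displayDt: number): void {
      world.Step(displayDt, 8, 3); // 8/3 are Box2D's commonly suggested iteration counts
    }

    // "Fixed stepping" as described above: dt is pinned at 1/60, so the
    // simulation is deterministic, but its speed drifts whenever the display
    // does not actually run at 60 Hz.
    function stepFixed60(world: PhysicsWorld): void {
      world.Step(1 / 60, 8, 3);
    }

    // Raising the iteration counts improves solver accuracy (stiffer joints,
    // less penetration) at extra CPU cost, but it does not make variable-dt
    // stepping deterministic - it only narrows the differences.
    function stepWithMoreIterations(world: PhysicsWorld, displayDt: number): void {
      world.Step(displayDt, 16, 6);
    }
    ```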

  • The moment I realized the difference was when working on a new 144 Hz monitor. When connecting objects with joints (e.g., building chains with revolute joints), they behave like smooth strings/chains and are completely stable. The same project at 60 Hz behaves similarly for low forces, but the joints can easily break at obstacles (come apart for a short time) when higher forces are applied. This is definitely fixable by preventing these forces and writing additional code around it, but that in turn makes the behavior at faster refresh rates feel a little 'overtuned'.

    Increasing the iterations helps to reduce the differences, but the lower latency still gave a superior result. Also, the higher accuracy costs performance on devices with higher refresh rates, even though they would not need it. The problem is that devices with fast refresh rates are not necessarily more powerful - they just have a faster display. So the best solution would probably be to adapt the accuracy according to the framerate, to find a tradeoff between CPU usage and 'reliable physics' (a rough sketch of what I mean is at the end of this post).

    Some devices (like some laptops) are starting to reach ridiculously high refresh rates like 360 Hz, where these small differences in physics behavior might add up. This unfortunately means one has to consider not only the weaker devices but also the more powerful ones. For those it would probably be best to just cap the updates at a level that is considered 'safe'.

    However, I can definitely see the problem with independent clocks, especially if the system can't keep up with the updates. That said, I have found that devices are often powerful enough to manage the higher update rate - they just won't, because of their display (especially when targeting stronger/newer devices). Particularly on PCs and laptops, the refresh rate is probably not really tied to the CPU specs. This leads to the undesirable situation where weaker devices might ask for high update rates while much stronger devices are held back by their display.

    I agree that the update rate should never be set anywhere near the limit, but I would argue that the cap imposed by the refresh rate is no longer the best option for all cases.
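
    Purely as an illustration of the adaptive accuracy / capped tick rate idea above, something like the following is what I have in mind - the thresholds, the 120-tick cap and the chooseSettings helper are all made up:

    ```typescript
    // Hypothetical sketch: pick a capped tick rate and iteration counts from
    // the measured refresh rate. All numbers here are arbitrary examples.
    interface SimSettings {
      tickRate: number;            // capped physics updates per second
      velocityIterations: number;  // solver accuracy vs. CPU cost
      positionIterations: number;
    }

    function chooseSettings(measuredRefreshHz: number): SimSettings {
      // Cap the tick rate so a 360 Hz display does not demand 360 physics steps.
      const tickRate = Math.min(measuredRefreshHz, 120);

      // Slower ticking leaves more time per step, so spend it on accuracy;
      // faster ticking already smooths the simulation, so fewer iterations suffice.
      if (tickRate <= 60) {
        return { tickRate, velocityIterations: 16, positionIterations: 6 };
      }
      return { tickRate, velocityIterations: 8, positionIterations: 3 };
    }

    // Example: a 144 Hz laptop gets capped at 120 ticks/s with default
    // iterations, while a 60 Hz monitor gets higher iteration counts instead.
    const settings = chooseSettings(144);
    console.log(settings); // { tickRate: 120, velocityIterations: 8, positionIterations: 3 }
    ```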
