I checked out the link you posted Kayin - it's a very good article, the best I've read on the subject. However, I'm still not convinced by variable rate logic. Firstly, as the article mentions, it could substantially increase CPU usage, especially with a high logic rate. Secondly - and I don't think the article covers this - if logic runs at a rate which is not a multiple of the vsync rate, you can get uneven motion. Let's consider easy numbers - vsync = 50Hz and logic = 75fps - for every vsync you alternate between 1 logic update and 2 logic updates. This means that from one screen refresh to the next, objects alternate between moving one step and moving two steps. I don't think this would look as smooth as the current system we have. Interpolation is an interesting idea, but I fear it's too difficult to retro-fit to our current engine (and it would make eventing a custom movement much harder).
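To make the alternation concrete, here's a tiny simulation of the step counting - just the arithmetic, not anything from our engine:

[code]
/* Count how many 75 Hz logic steps fit in each 50 Hz vsync frame.
 * Time is measured in 1/150 s ticks (the common grid of 50 Hz and 75 Hz)
 * so the arithmetic is exact: one frame = 3 ticks, one logic step = 2 ticks.
 * The output alternates 1, 2, 1, 2, ... updates per refresh. */
#include <stdio.h>

int main(void)
{
    const int frame_ticks = 3;   /* 1/50 s = 3 * (1/150 s) */
    const int logic_ticks = 2;   /* 1/75 s = 2 * (1/150 s) */
    int accumulator = 0;

    for (int frame = 0; frame < 8; frame++) {
        accumulator += frame_ticks;
        int steps = 0;
        while (accumulator >= logic_ticks) {
            accumulator -= logic_ticks;
            steps++;             /* the object moves one step per logic run */
        }
        printf("frame %d: %d logic step(s)\n", frame, steps);
    }
    return 0;
}
[/code]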
Well, glad it was a good article and a good read! Yeah, that's why a much higher logic rate seems better -- with obvious potential problems. It'd be nice if there was a good way to tell how much CPU the logic uses in comparison to the display, just for theory's sake. But anyways, a lot of this will depend on how easily you can program it and have it work appropriately. What issues would this have with custom movement engines, for example? Is it more of a technical issue?
Anyways, as I think about it, the best way to handle this might be on the event list side. Say a loop-esque feature that runs an event x times a second. Basically, it updates the chosen logic at the rate desired (sort of like setting it on a timer, but it interpolates how many times it has to run per tick). I'd set this up like a function. Like
-On Independent Logic "hurfdurf"
--Game jumping/moving logic.
and then like
-On start of frame
--Set Independent Logic Rate "hurfdurf" 100 cycles per second
This might even be doable with stuff already provided. The only problem is that animations would likely not sync up appropriately, and we would need some way to control that. I personally like setting all animation speeds to zero and manually forcing 'tick' updates on the animations, but that just seems like the easiest answer from my view.
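A rough sketch of the rate idea in code, assuming nothing about the actual engine - run_independent_logic, hurfdurf and the rest are all made-up names:

[code]
/* Sketch of the "independent logic rate" idea: given a desired rate in
 * cycles per second and this frame's timedelta, work out how many whole
 * times the attached logic should run this tick, carrying the fraction
 * over to the next tick. All names here are made up for illustration. */
#include <stdio.h>

typedef struct {
    double rate;        /* desired cycles per second, e.g. 100        */
    double carry;       /* fractional cycles left over from last tick */
} IndependentLogic;

static void run_independent_logic(IndependentLogic *il, double timedelta,
                                  void (*logic)(void))
{
    il->carry += il->rate * timedelta;      /* cycles owed this tick */
    while (il->carry >= 1.0) {
        il->carry -= 1.0;
        logic();                            /* the "hurfdurf" body goes here */
    }
}

static void hurfdurf(void)                  /* jumping/moving logic */
{
    printf("logic cycle\n");
}

int main(void)
{
    IndependentLogic il = { 100.0, 0.0 };   /* 100 cycles per second */
    /* Pretend we refresh at 60 Hz: each frame owes 100/60 = 1.67 cycles,
     * so the logic runs 1 or 2 times per frame, averaging 100 per second. */
    for (int frame = 0; frame < 6; frame++)
        run_independent_logic(&il, 1.0 / 60.0, hurfdurf);
    return 0;
}
[/code]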
Assuming there was some option to run logic at a multiple of the vsync rate (e.g. 5 logic runs per vsync), and assuming timers and user input were updated between these separate logic executions, I guess it might be possible to reduce the inaccuracy by reducing the quantisation error. I think this could be programmed like Motion Blur, but with the logic-only runs skipping the rendering.
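Roughly what I'm imagining - every function in this sketch is a placeholder, not a real engine call:

[code]
/* Sketch of running the event list at a fixed multiple of the vsync rate:
 * 5 logic passes per refresh, polling input and advancing timers between
 * passes, but drawing only once per refresh. Every function here is a
 * stand-in, not an actual engine call. */
#include <stdio.h>

#define LOGIC_PER_VSYNC 5

static void poll_input(void)         { /* refresh keyboard/joystick state */ }
static void update_timers(double dt) { /* advance engine timers by dt */ (void)dt; }
static void run_event_list(void)     { /* execute the full event list once */ }
static void render_frame(int frame)  { printf("draw frame %d\n", frame); }

int main(void)
{
    const double vsync_rate = 50.0;
    const double logic_dt = 1.0 / (vsync_rate * LOGIC_PER_VSYNC);

    for (int frame = 0; frame < 3; frame++) {       /* a few frames for demo */
        for (int i = 0; i < LOGIC_PER_VSYNC; i++) {
            poll_input();             /* input is seen between sub-steps */
            update_timers(logic_dt);  /* timers advance in smaller steps */
            run_event_list();         /* the CPU-expensive part          */
        }
        render_frame(frame);          /* only one draw per vsync         */
    }
    return 0;
}
[/code]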
If it's easy to program, I might give it a shot. I don't think I can truly have keyboard input update fairly in logic-only runs, but it might work. It is a very CPU-intensive way of solving the problem though - I have no choice but to execute the full event list, not just the motion/timer-based events, in the logic-only runs. And I don't know if it'll even help solve the problem significantly. What do you think?
By the way, I've noticed fullscreen games run with very consistent timedelta values - they are usually exactly equal to 1 / vsync rate. It's only windowed games which suffer random timedelta variations, probably because Windows gives fullscreen apps a higher priority.
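For what it's worth, one way to log the timedelta per frame and compare it with 1/vsync - this uses POSIX clock_gettime, and wait_for_vsync is just a stand-in for however the engine actually presents a frame:

[code]
/* Log the gap between successive frames and compare it with 1/vsync.
 * clock_gettime(CLOCK_MONOTONIC, ...) and nanosleep are POSIX calls;
 * wait_for_vsync() is only a placeholder that sleeps ~20 ms. */
#include <stdio.h>
#include <time.h>

static void wait_for_vsync(void)
{
    /* Placeholder: the real engine would present a frame and block
     * until the next vertical retrace. */
    struct timespec ts = { 0, 20 * 1000 * 1000 };
    nanosleep(&ts, NULL);
}

int main(void)
{
    const double vsync_dt = 1.0 / 50.0;
    struct timespec prev, now;
    clock_gettime(CLOCK_MONOTONIC, &prev);

    for (int frame = 0; frame < 10; frame++) {
        wait_for_vsync();
        clock_gettime(CLOCK_MONOTONIC, &now);
        double timedelta = (now.tv_sec - prev.tv_sec)
                         + (now.tv_nsec - prev.tv_nsec) / 1e9;
        prev = now;
        printf("frame %2d: timedelta %.4f s (ideal %.4f s)\n",
               frame, timedelta, vsync_dt);
    }
    return 0;
}
[/code]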
I thank you for considering this as always, and hopefully we can come up with a solution to the problem some of us have. I don't see how this wouldn't solve the problem -- the issue, I'd think, is more whether it'd cause new problems (by being stuttery or something). Maybe I'll take a stab at making an engine that does this with the available features as a proof of concept (ignoring the animation issues).