Hi all,
a question about time.
My game adds dt to a variable every tick and shows its value in a Text object on
screen (just for testing purposes). Timescale = 1.
On both my PCs, one running the game at about 40 fps and the other at 60+ fps, the value shown
matches my wrist chronometer almost exactly: I start the game and the chronometer at the
same time, and when the chrono reaches, say, 60 seconds, more or less the same value
appears on screen.
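For reference, the test logic is basically this (a TypeScript sketch of what the events do; the names, the stand-in Text object and the per-tick callback are just illustrative, not my actual project code):

```typescript
// Sketch of the test: accumulate dt every tick and show the total on screen.
// Here `dt` stands for the engine's per-tick delta time in seconds
// (already scaled by the timescale, which is 1 in my test).

// Hypothetical stand-in for the on-screen Text object.
const textObject = {
  setText(value: string): void {
    console.log(value);
  },
};

let elapsed = 0; // the variable shown in the Text object

function onTick(dt: number): void {
  elapsed += dt;                          // add this frame's delta time
  textObject.setText(elapsed.toFixed(1)); // display the running total
}
```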
But on a Motorola Moto G phone (Android 5.0.2, running at 40 fps, timescale = 1) things are
different: when the chrono reaches 60, the value on screen is still at about 55, and another 5 or 6
seconds are needed for it to reach 60. The game also runs slightly, but visibly, slower
(which is why I performed this test in the first place). Of course I make extensive use of deltatime
in my project, but in this case it doesn't help.
Note: the PC tests were run in preview, while the phone test was done on a build made with the Intel SDK (without Crosswalk).
Probably I have misunderstood something, but what? And if not... is a workaround possible, just to make
my game run at the same speed on all devices?
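To be concrete, the kind of workaround I have in mind would be to derive the elapsed time (and the per-tick delta) from a wall-clock timestamp instead of summing the engine's dt, roughly like this sketch (Date.now() is my assumption of a clock available in the exported HTML5 game; the names are illustrative):

```typescript
// Sketch of the workaround idea: compute the delta from a wall-clock timestamp
// so the accumulated time matches real time even if the engine's dt drifts.

let lastTime = Date.now(); // milliseconds since the epoch
let elapsed = 0;           // real elapsed seconds, to drive movement and timers

function onTick(): void {
  const now = Date.now();
  const realDt = (now - lastTime) / 1000; // seconds since the previous tick
  lastTime = now;
  elapsed += realDt;
  // use realDt instead of the engine's dt for "speed * realDt" style movement
}
```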
Thanks in advance (and also for the effort of reading my English :) )