Hello!
Recently I discovered an issue where moving objects run at different speeds on different PCs. At first I thought it was some weird bug, but after doing proper research (which I should have done at the start), I found the "delta-time" (dt) expression. It keeps sprites that are moved manually, without behaviors, from behaving differently depending on the FPS and the system running the game. I found a tutorial and a post about it, but I still can't wrap my head around it. Say a sprite moves 6px to the right every tick (Self.X + 6); what would be the delta-time analogue of those six pixels? Is there a formula for translating a plain per-tick pixel count into its delta-time equivalent? Many thanks in advance!
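Edit: to save future readers some digging, the conversion I was after seems to be: treat the old per-tick step as a speed that was tuned for some framerate (usually 60 FPS), so 6 px per tick becomes 6 × 60 = 360 px per second, and the event expression becomes Self.X + 360 * dt (if I read the manual right, dt is the time in seconds since the last tick). Below is a minimal sketch of the same idea as a plain browser TypeScript loop, not Construct itself; the names PX_PER_TICK, TUNED_FPS and SPEED are my own, and 60 FPS is an assumption about what the old value was tuned for:

```ts
// Minimal sketch: frame-rate-independent movement in a browser loop.
// Assumption: the old per-tick value (6 px) was tuned for ~60 FPS.

const PX_PER_TICK = 6;                  // old step applied once per tick
const TUNED_FPS = 60;                   // framerate that step was tuned for
const SPEED = PX_PER_TICK * TUNED_FPS;  // 360 px per second

let x = 0;
let last = performance.now();

function frame(now: number): void {
  const dt = (now - last) / 1000;  // seconds since the previous frame
  last = now;

  // Old, framerate-dependent version was: x += PX_PER_TICK;
  // This version moves 360 px/s no matter what the FPS is:
  x += SPEED * dt;

  requestAnimationFrame(frame);
}

requestAnimationFrame(frame);
```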
Update: Looks like the only way is to rewrite all of the functions that use simple sprite movement and work out the new "speeds" from scratch. This whole time I was seeing a distorted movement speed that depended more on the specs of my PC than on the real numbers (I suddenly realised that 6 px per second is much, much slower than what I was seeing in my browser every day for the past month).
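For the record, that arithmetic also explains the different speeds in the first place: with Self.X + 6 every tick, a machine running at 60 FPS moves the sprite 6 × 60 = 360 px/s, while one running at 144 FPS moves it 6 × 144 = 864 px/s, so the "same" expression looked faster or slower depending on the PC.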