Sorry to drop my big mean ass in here .... but ... I see this ...
'Every 1*dt seconds' vs. 'Every 1 seconds'.
From the manual:
Common mistakes
Never use dt in the Every X seconds condition! An event like Every 1 second will run every 1 second regardless of the framerate, thus is already framerate independent. The condition measures time, not frames. If you make an event like Every 60 * dt seconds, you have inadvertently made the condition framerate dependent - the opposite of what you want to achieve! Such a condition will run every 6 seconds at 10 FPS (when dt = 0.1), or every 0.6 seconds at 100 FPS (when dt = 0.01); if you just use Every 6 seconds, it already runs every 6 seconds regardless of the framerate.
https://www.scirra.com/tutorials/67/del ... dependence
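Construct events are not text code, so just to make the manual's arithmetic visible, here is a little TypeScript sketch (the FPS values are the ones from the quote, the function name is just made up). It shows what interval 'Every 60 * dt seconds' actually works out to at different framerates, versus a plain 'Every 6 seconds':

// Toy TypeScript simulation (not Construct code) of the manual's numbers.
function intervalWithDt(factor: number, fps: number): number {
  const dt = 1 / fps;  // dt as Construct would report it at this framerate
  return factor * dt;  // "Every factor * dt seconds" -> interval in real seconds
}

for (const fps of [10, 60, 100]) {
  const bad = intervalWithDt(60, fps); // framerate DEPENDENT
  const good = 6;                      // "Every 6 seconds" - framerate independent
  console.log(`${fps} FPS: "Every 60*dt seconds" = every ${bad.toFixed(2)} s, ` +
              `"Every 6 seconds" = every ${good} s`);
}
// Prints 6 s at 10 FPS, 1 s at 60 FPS, 0.6 s at 100 FPS for the dt version.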
You have 3 options (to my limited knowledge) to get more accurate time measuring.
1/ The preferred one. Do not allow the game to drop under the minimum framerate.
The default minimum framerate is 30 FPS, or (written another way) a maximum dt of 1/30. Minimum means in this case that once dt reaches 1/30, it will not grow any bigger, even when the FPS drops further. This means that once you are below the minimum framerate, everything that depends on dt or on game time (including 'Every X seconds') runs slower than real time, because the engine pretends each tick took only 1/30 of a second while it really took longer (see the first sketch after this list).
So, optimise the game and don't let it drop below the minimum framerate. It is not only 'Every X seconds' that will be off; every behavior that uses dt will be off. Collision detection will fail. Platform jumps will jump higher. Timers are off. And more and more ....
2/ Lower the minimum framerate with the action System > Set minimum framerate. But if you go extreme with this, and the game drops in FPS on one tick and recovers the next, the difference in dt from tick to tick can be really big. Things can and will behave erratically.
3/ Make your own timers using the system expression 'wallclocktime', if you can live with everything else behaving badly because the FPS dropped below the minimum framerate (see the second sketch below).
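About option 1: here is a rough TypeScript sketch of what the dt clamp does to game time (again not Construct code; the only assumption is that dt is capped at 1/minimum framerate and that timers accumulate that clamped dt):

// Toy simulation of the dt clamp at the minimum framerate.
function simulate(realFps: number, minFramerate: number, realSeconds: number): void {
  const realDt = 1 / realFps;                            // how long a tick really takes
  const clampedDt = Math.min(realDt, 1 / minFramerate);  // dt as reported to events
  const ticks = Math.round(realSeconds * realFps);
  const gameTime = ticks * clampedDt;                    // time the game *thinks* has passed
  console.log(`${realFps} FPS, min ${minFramerate}: after ${realSeconds} real s, ` +
              `game time = ${gameTime.toFixed(2)} s`);
}

simulate(60, 30, 10); // above the minimum: game time matches real time
simulate(10, 30, 10); // below the minimum: game time runs at 1/3 speed,
                      // so an 'Every 1 second' event fires every ~3 real seconds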
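About option 3: a rough sketch of the wallclocktime idea, keeping your own timer against real time. In Construct you would compare the 'wallclocktime' expression to a stored variable; in this TypeScript stand-in I use Date.now() instead, and the class and variable names are just made up:

// Toy sketch of a timer measured against real (wall clock) time.
class WallClockTimer {
  private nextFire: number;

  constructor(private intervalSeconds: number) {
    this.nextFire = Date.now() / 1000 + intervalSeconds;
  }

  // Call once per tick; returns true when the interval has elapsed in REAL time,
  // no matter what dt is doing.
  tick(): boolean {
    const now = Date.now() / 1000;  // stand-in for the 'wallclocktime' expression
    if (now >= this.nextFire) {
      this.nextFire += this.intervalSeconds;  // schedule the next firing
      return true;
    }
    return false;
  }
}

// Usage, inside your tick/update loop:
const spawnTimer = new WallClockTimer(6);  // fire every 6 real seconds
// if (spawnTimer.tick()) { /* spawn the enemy, etc. */ }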