Comparison between Ticks and Milliseconds?

  • Okay, so I'm currently trying to figure out how many ticks it would take to be exactly equal to 6000 milliseconds. When the player bombs (the process takes a total of 6000 milliseconds according to the Timeline object), there's a variable that is set to a certain number and decreases by one every tick, but no matter how I calculate it, I either get a delay that isn't what I want, or a bomb starts before the previous one ends, messing up the pattern royally.

    How much does a tick vary from a millisecond?

  • Milliseconds are a measurement of time. Ticks are a measurement of game logic/rendering.

    In other words, a tick is a frame. If your game is overloaded with lots of nested loops and heavy use of shaders on an old computer, then your FPS (tick rate) could drop very low, but milliseconds will keep passing by as normal.

    So you can't really reliably gauge time based on ticks. If your game is 60fps and you say "Every 60 ticks" then you might think you're going to activate that event once per second... but what if your game drops framerate? If your event is time sensitive then you're boned.

    Generally speaking, if you need something to happen in a specific amount of time, use milliseconds. I hardly ever use ticks; in fact, I can't even remember the last time I did.
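
    To make the difference concrete, here is a minimal sketch in TypeScript (browser timing APIs and made-up variable names, not Construct's event system) contrasting the two kinds of countdown:

        // Hypothetical sketch: tick-based vs. millisecond-based timers.
        let tickTimer = 6 * 60;    // "6 seconds", assuming 60 ticks per second
        let msTimer = 6000;        // 6 seconds of real time
        let last = performance.now();

        function onTick(): void {
          const now = performance.now();
          const elapsedMs = now - last;  // real time since the previous tick
          last = now;

          tickTimer -= 1;          // loses 1 per frame, however slow the frame
          msTimer -= elapsedMs;    // loses exactly the real time that passed

          // At a steady 60 fps both hit zero together; at 30 fps the
          // tick-based timer takes 12 real seconds, the ms-based one still 6.
          requestAnimationFrame(onTick);
        }
        requestAnimationFrame(onTick);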

  • So I set the Bomb_Delay to 6000, the same as the time it takes to finish one bomb cycle, and every 1 millisecond the game is set to drop the Bomb_Delay by 1 until it gets to 0, letting you bomb again. Obviously this doesn't work, because even after a minute the delay hasn't passed; by now I figured I could just make another timeline instead. But is there even a way to check something every millisecond?

  • Also, when I try to rename a second timeline, it renames the first no matter what... huh...

  • If I remember correctly, the 'Every X Milliseconds' condition is only accurate to around 20 milliseconds; anything less than that is iffy.

    So I would try this:

    Set bomb timer to 6 (be sure you only trigger this once)

    Every 1000 milliseconds, subtract 1 from timer.
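
    In plain code, that suggestion looks something like this sketch (setInterval standing in for the 'Every 1000 milliseconds' event; the variable names are hypothetical):

        // Sketch of the whole-seconds countdown: one decrement per second.
        let bombTimer = 6;  // seconds until bombing is allowed again

        const handle = setInterval(() => {
          bombTimer -= 1;
          if (bombTimer <= 0) {
            clearInterval(handle);
            // bomb is ready: the 6-second cycle has elapsed
          }
        }, 1000);  // 1000 ms is comfortably above the ~20 ms accuracy floor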

  • > So you can't really reliably gauge time based on ticks. If your game is 60fps and you say "Every 60 ticks" then you might think you're going to activate that event once per second... but what if your game drops framerate? If your event is time sensitive then you're boned.

    Can't this be avoided with time deltas?

  • > So you can't really reliably gauge time based on ticks. If your game is 60fps and you say "Every 60 ticks" then you might think you're going to activate that event once per second... but what if your game drops framerate? If your event is time sensitive then you're boned.

    >
    > Can't this be avoided with time deltas?

    Well yes, you could do something like the following:

    Set Bomb_Delay to 1000
    Always: Subtract 1000*TimeDelta from Bomb_Delay
    Bomb_Delay <= 0 -> BOMB AWAY
    
    But I'd probably use the method deadeye suggested.
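
    For reference, the same TimeDelta pattern as runnable TypeScript (a sketch using browser timing APIs rather than Construct's event system, with the thread's 6000 ms figure):

        // Bomb_Delay holds milliseconds and is drained by the real elapsed
        // time each frame, so it runs out on schedule at any frame rate.
        let bombDelay = 6000;
        let lastFrame = performance.now();

        function frame(): void {
          const now = performance.now();
          const deltaMs = now - lastFrame;  // plays the role of 1000*TimeDelta
          lastFrame = now;

          if (bombDelay > 0) {
            bombDelay -= deltaMs;
            if (bombDelay <= 0) {
              bombDelay = 6000;  // BOMB AWAY, then restart the cycle
            }
          }
          requestAnimationFrame(frame);
        }
        requestAnimationFrame(frame);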
  • 'Every X milliseconds' cannot trigger faster than the framerate, so at 75 fps, 'Every 1 ms' will actually trigger every 13.3 ms (i.e. every tick).

    The correct way is to use TimeDelta.
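
    A small sketch of that clamping (a hypothetical helper, not Construct's internals):

        // An event checked once per tick can fire at most once per tick,
        // so the effective period is the larger of the requested period
        // and the frame time.
        function effectivePeriodMs(requestedMs: number, fps: number): number {
          const frameMs = 1000 / fps;  // time between checks
          return Math.max(requestedMs, frameMs);
        }

        console.log(effectivePeriodMs(1, 75));    // ~13.3 ms, not 1 ms
        console.log(effectivePeriodMs(1000, 75)); // 1000 ms: unaffected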
