If I have an object that needs to do something in 10 seconds...
I can set a tween "" from x to y over 10 seconds.
I can set a timer "" for 10 seconds.
I can use an instance variable, set it to 10, and subtract dt every tick.
(If I didn't need bulletTime / scaled dt, I could also just store the current timestamp and check it against the current time.)
For the timer and the tween, I can use the On Timer "" and On Finished "" triggers respectively.
For the event-based method, I simply check whether the variable is <= 0 (rough sketch below).
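For clarity, this is roughly what the event-based version boils down to, written as a minimal TypeScript sketch. The CountdownTimer class and the names in it are purely illustrative assumptions, not Construct's actual scripting API:

```ts
// Minimal sketch (not Construct's scripting API) of the event-based
// approach: keep a countdown variable and subtract the frame's
// (scaled) dt every tick.
class CountdownTimer {
  private remaining = 0;
  private running = false;

  start(seconds: number): void {
    this.remaining = seconds;
    this.running = true;
  }

  // Call once per tick with delta time (already scaled if you are
  // using bullet time / a custom time scale).
  update(dt: number, onDone: () => void): void {
    if (!this.running) return;
    this.remaining -= dt;
    if (this.remaining <= 0) {
      this.running = false;
      onDone();
    }
  }
}

// "Do something in 10 seconds":
const timer = new CountdownTimer();
timer.start(10);
// every tick: timer.update(dt, () => { /* fire the action here */ });

// Unscaled alternative mentioned above (no bullet time): store a
// target timestamp once and compare it to the clock each tick.
const target = performance.now() + 10_000;
// every tick: if (performance.now() >= target) { /* fire */ }
```

The per-tick update call is exactly the overhead described below: one subtraction plus one comparison for every object, every frame.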
I wasn't too surprised that tweens perform worse than events. They have a lot of extra fluff that isn't needed for a simple linear 1:1 dt-to-value change.
But timers!!!! They are the worst, even though the tween is mimicking a timer and can do so much more.
In a project with 20k objects, tested with event-based timers, tween-based timers, and Timer-behavior timers, the event-based timers were the only ones that could run at 60 fps. The downside is the overhead of checking whether the timer variable is <= 0 every tick. The surprise is that the On Timer trigger has exactly the same overhead, but the tween doesn't.
With the Timer test, even when no timers are running, the On Timer trigger alone causes 30% CPU usage. When all timers are activated, the project runs at around 27 fps.
With the Tween test, if no tweens are running, the project sits at 0-1% CPU usage. PERFECT... until you run the tweens, and then you get around 27 fps. This indicates that tweens ARE slower than timers, but since timers carry the trigger overhead even when idle, it evens out.
Events use around 30% CPU for the conditional check to see if a timer is done, and an additional 15% simply to add dt to the timer. Total: 45%.
So basically, at scale, depending on my needs and on how often the timer is active versus idle, I would use either custom events or tweens.
HOW can events outperform the Timer behavior!!!! The Timer behavior should be king of performance for anything timer-related... but it isn't.