I know dt relates to FPS (it's the time in seconds since the last tick).
I know 60 is the target FPS for the system.
dt*60 is used to take the actual FPS into account.
The system doesn't always run at 60 FPS.
Why dt*60 and not dt*fps?
I have an idea why, but I am missing something...
dt*fps seems like it would get stuck in a feedback loop, but I'm not sure.
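To show what I mean, here is my current mental model as a plain TypeScript sketch (not Construct event code, and the 11-pixel speed is just a made-up example). The assumption is that dt is the seconds elapsed since the last tick, so dt is roughly 1/fps:

// TypeScript sketch of my understanding, not Construct event code.
// Assumption: dt is the seconds elapsed since the previous tick, so dt ~ 1/fps.

function moveWithDtTimes60(x: number, dt: number): number {
  // "11 pixels per frame, assuming 60 FPS" becomes 11 * 60 = 660 pixels
  // per second, no matter what the actual framerate is.
  return x + 11 * dt * 60;
}

function moveWithDtTimesFps(x: number, dt: number, fps: number): number {
  // If dt ~ 1/fps, then dt * fps ~ 1 on every tick, so this just moves
  // 11 pixels per *frame* again, which is exactly the frame-dependence
  // that dt was supposed to remove.
  return x + 11 * dt * fps;
}

// At 60 FPS (dt = 1/60): moveWithDtTimes60 adds 11 px per tick, 660 px per second.
// At 30 FPS (dt = 1/30): moveWithDtTimes60 adds 22 px per tick, still 660 px per second.
// moveWithDtTimesFps adds 11 px per tick in both cases, so 660 px/s vs 330 px/s.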
I noticed that in some cases multiplying by dt*60 doesn't work.
An example would be setting a layer angle or layer scale.
When I use 1*dt*60 the change happens almost instantly; I had to use 1*dt to get usable results.
Why is this?
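My guess, and this may be where I'm misreading things, is that 1*dt*60 means "1 unit per frame at 60 FPS", i.e. 60 units per second, while 1*dt means "1 unit per second", so the dt*60 version isn't broken, it's just 60 times faster. A quick TypeScript sketch of that guess (made-up values, not Construct code):

// TypeScript sketch of my guess, not Construct code.
const dt = 1 / 60;              // one tick at 60 FPS (assumed)

let anglePerFrameStyle = 0;     // incremented with 1*dt*60
let anglePerSecondStyle = 0;    // incremented with 1*dt

for (let tick = 0; tick < 60; tick++) {  // simulate one real-world second
  anglePerFrameStyle += 1 * dt * 60;     // roughly 60 degrees after one second
  anglePerSecondStyle += 1 * dt;         // roughly 1 degree after one second
}

console.log(anglePerFrameStyle, anglePerSecondStyle); // ~60 and ~1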
Does timescale affect FPS?
I always see 60 FPS, but a TimeScale of 2.0 makes an in-game second last half a real-world second.
The system shows 60 FPS regardless.
So does TimeScale 2.0 effectively mean 120 FPS in real-world time, or is it still 60?
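The way I currently picture it (please correct me if this is wrong) is that timescale scales dt rather than the number of ticks, so the engine still runs about 60 ticks per real-world second, but each tick covers twice as much game time. A rough TypeScript sketch of that mental model, not actual Construct internals:

// TypeScript sketch of my mental model, not actual Construct internals.
// Assumption: the engine measures the real elapsed time per tick,
// then multiplies it by the timescale before handing it to events as dt.

const realDt = 1 / 60;          // real seconds per tick at 60 FPS
const timescale = 2.0;

const dt = realDt * timescale;  // the dt events see: 1/30 of a game-second per tick

// The FPS counter still reports 60, because 60 ticks still happen per real second.
// But 60 ticks * (1/30 game-second each) = 2 game-seconds per real second,
// which is why a game second only lasts half a real-world second at TimeScale 2.0.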
I recently had a timescale issue.
An object would still move when the system timescale was set to 0.
I originally adjusted the object's timescale, but found it didn't work for an instance variable (adjusted every tick).
Some older posts suggested:
Self.X - 11*dt*60
They said the formula adjusts for both timescale and FPS, and it does; I tested it.
This implies a TimeScale of 2 would do 120 ticks' worth of movement per real-world second, correct?
I also tested:
Self.X - (11 * timescale)
It doesn't seem to adjust based on FPS.
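If dt already has timescale baked into it, that would explain both results: 11*dt*60 ends up frame-rate independent and timescale aware, while 11*timescale still depends on how many ticks actually run. Here is the arithmetic as a TypeScript sketch (the dt = realDt * timescale part is my assumption, not confirmed engine behaviour), which also lines up with the 120-ticks'-worth idea above:

// TypeScript sketch, not Construct code; the 11-pixel speed is made up.
// Assumption: dt already includes timescale, i.e. dt = realDt * timescale.

function pixelsPerRealSecond(fps: number, timescale: number) {
  const realDt = 1 / fps;
  const dt = realDt * timescale;

  const withDt60 = 11 * dt * 60 * fps;        // Self.X - 11*dt*60 every tick
  const withTimescale = 11 * timescale * fps; // Self.X - (11*timescale) every tick

  return { withDt60, withTimescale };
}

console.log(pixelsPerRealSecond(60, 1)); // withDt60: 660, withTimescale: 660
console.log(pixelsPerRealSecond(30, 1)); // withDt60: 660, withTimescale: 330 (FPS-dependent)
console.log(pixelsPerRealSecond(60, 2)); // withDt60: 1320, i.e. 120 normal ticks' worth of movement
console.log(pixelsPerRealSecond(60, 0)); // both 0, so the object stops when timescale is 0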
Also, since we're on the topic of TimeScale/dt/framerate:
I saw "Set Minimum FrameRate" in C3. What does this do?
I didn't understand the explanation in the manual.
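From what I could piece together (so correct this too if it's wrong), my guess is that it puts a cap on dt: if the real framerate drops below the minimum, dt stops growing, so the game goes into slow motion instead of objects taking huge steps in a single tick. A TypeScript sketch of that guess:

// TypeScript sketch of my guess at what "Minimum framerate" does, not confirmed.
const minimumFramerate = 30;            // example value
const maxDt = 1 / minimumFramerate;     // the largest dt events would ever see

function clampDt(realElapsedSeconds: number): number {
  // If a frame took longer than 1/30 s (i.e. FPS dropped below 30),
  // dt is capped, so movement like 11*dt*60 can't jump a huge distance
  // in one tick; the game just slows down instead.
  return Math.min(realElapsedSeconds, maxDt);
}

console.log(clampDt(1 / 60)); // ~0.0167, passed through unchanged
console.log(clampDt(0.25));   // a 4 FPS spike, clamped to ~0.0333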
It kind of seems like I've answered most of my own questions.
Can someone correct me if I'm wrong, or confirm if I'm right?
P.S.
If the timescale formula answer is different for C3, I'd like to know that too.
I expect it to be the same, but a new expression or something may have been added.