So, there's this cube moving linearly across the screen using a timeline, and it gets displaced by about 10 pixels on the X axis and about 3.5 pixels on the Y axis.
Consider that these values are for timescale = 1.
So, to my surprise, when I set the timescale to 0.5, the displacement on both axes becomes a quarter of those values, not half.
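Just to spell out the arithmetic I mean (rough numbers and placeholder names, not my actual project code):

# Approximate per-frame displacement I measured at timescale = 1
dx_full, dy_full = 10.0, 3.5

timescale = 0.5
expected = (dx_full * timescale, dy_full * timescale)            # (5.0, 1.75)  -- half, what I expected
observed = (dx_full * timescale ** 2, dy_full * timescale ** 2)  # (2.5, 0.875) -- a quarter, what I actually get

print("expected:", expected)
print("observed:", observed)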
I know my brain is probably not mathing right, but can someone explain why? I'm very confused.
For anyone interested in seeing the example running:
drive.google.com/file/d/1kKKSs1NWihDnfvbXudm2XPHklDcwFRBG/view
Thank you in advance