> Are you sure you meant every 0.01 seconds? Every 0.01 seconds should basically run the same as every tick at 60 fps.
>
I think that would be 0.001 sec
Nope, 0.01 is a hundredth of a second, i.e. 100 times per second, which is faster than the refresh rate of most displays. I just checked: at a steady 60 fps, adding 1 to two variables, one on every tick and the other every 0.01 sec, makes both increase at the same rate. On my computer, which currently doesn't hold a rock-solid 60 fps, the every-tick counter only gets about 1 ahead every 10 seconds, so it's close enough to exactly the same functionality.
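Here's a rough sketch of that test as a Python simulation rather than actual engine events; the 60 fps loop, the counter names, and the assumption that a timed event can fire at most once per tick are all mine, but it shows why a 0.01 s interval collapses to per-tick behaviour at 60 fps:

```python
# Rough simulation, not engine code: compares an "every tick" counter with an
# "every 0.01 s" counter over 10 seconds of a steady 60 fps loop, assuming the
# timed event can fire at most once per tick.

TICK = 1 / 60          # ~0.0167 s per frame at a steady 60 fps
INTERVAL = 0.01        # the "every 0.01 seconds" condition

every_tick = 0
every_interval = 0
time_since_fire = 0.0

for _ in range(600):   # 600 ticks = 10 seconds of game time
    every_tick += 1

    time_since_fire += TICK
    if time_since_fire >= INTERVAL:   # true on every tick, since 1/60 > 0.01
        every_interval += 1
        time_since_fire = 0.0         # at most one trigger per tick

print(every_tick, every_interval)     # both print 600
```

Since one tick at 60 fps lasts about 0.0167 s, which is already longer than 0.01 s, the interval check passes on every frame, so both counters reach 600 after 10 simulated seconds; any drift you see in practice comes from framerate fluctuations, not from the events themselves.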
If using every 0.01 seconds really does improve performance, there's some other reason for it, and that's worth looking into.