Hello.
In my project, items are stored outside the player object for ease of access, and all of them are part of a family.
When a player uses an item, it plays an animation on the player. I want to be able to set how long the animation plays based on how long it takes to use the item.
As of now I determine that with "item.timer * player.AnimationFrameCount".
This works fine on the player side.
But on the item side I am not sure how to make the math scale with time properly. For instance, say I want an item to last half a second. If I set the timer variable to 0.5 for "half a second", the animation plays for 2 seconds, and if I set it to 1.5 for "one and a half seconds", it plays for two-thirds of a second.
I understand the mathematical reason why. But how can I convert 1.5 into one and a half seconds, 0.5 into half a second, and so on?
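(For anyone following along: the timer value is being used as a playback-speed multiplier, not as a duration. Assuming the engine plays an animation with AnimationFrameCount frames at the computed value in frames per second, the actual duration is AnimationFrameCount / (timer * AnimationFrameCount) = 1 / timer seconds. That is why 0.5 plays for 2 seconds (1 / 0.5) and 1.5 plays for two-thirds of a second (1 / 1.5).)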
Edit: I figured it out, but I will go ahead and leave this up for others to find.
All I had to do was divide one by the length of time provided in the variable. So my final formula is "(1 / items.timer) * players.AnimationFrameCount".
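Here is a minimal, engine-agnostic sketch of that formula in Python. The function name and the frame count of 8 are hypothetical, just for illustration; the assumption is that the engine plays an animation with frame_count frames at the returned speed in frames per second, so the animation finishes in frame_count / speed seconds.

```python
# Sketch of the corrected formula (assumption: speed is in frames per
# second, so an animation with frame_count frames played at `speed` fps
# finishes in frame_count / speed seconds).

def animation_speed(desired_duration: float, frame_count: int) -> float:
    """Return the playback speed (fps) that makes an animation with
    frame_count frames last desired_duration seconds."""
    return (1 / desired_duration) * frame_count

# timer = 0.5 -> the full 8-frame animation now takes exactly half a second:
speed = animation_speed(0.5, 8)  # 16 fps; 8 frames / 16 fps = 0.5 s
print(speed)
```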
Thanks