Probably a silly question, but I'm having issues with adding (or subtracting) specific amounts to a number every tick.
Example: I have a sprite with a "Size" instance variable, which then sets the sprite's scale. It starts at 0.1, which equals 10% when set as the sprite's scale. While it's on screen, every tick it adds 0.01 to the instance variable, up to 1, which would be 100% scale. At that point it triggers another action and stops.
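Here's roughly what that logic looks like translated into plain code (`size`, `onTick`, and `setScale` are just placeholder names standing in for my events, not actual engine calls):

```ts
// Rough sketch of my event logic. size, onTick, and setScale are
// placeholders for my events, not a real engine API.
let size = 0.1; // "Size" instance variable; 0.1 = 10% scale

function onTick(): void {
  if (size < 1) {
    size += 0.01; // add 0.01 every tick
    // setScale(size); // sprite scale follows the variable
  }
  if (size === 1) {
    // trigger the follow-up action; this is the part that never fires
  }
}
```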
My issue is that the trigger is looking for PRECISELY 1, and the system is somehow returning 0.999999997 when I watch the variable count up in debug mode. So the trigger never activates, since the number never reaches exactly 1. This doesn't really make sense to me, as the increment it's adding isn't anywhere near that decimal place. I can only assume it's an issue with the every-tick behavior? Would I be better off using a specific delta-time action? I'm sure this is basic-level programming knowledge, but I'm willing to look dumb if it helps me eventually be smart 😅
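For what it's worth, I can reproduce the same drift outside the engine with a quick loop (the exact value printed may differ from what my debugger shows, but either way it never lands on exactly 1):

```ts
// 0.1 and 0.01 can't be represented exactly in binary floating point,
// so a tiny rounding error accumulates on every addition.
let size = 0.1;
for (let i = 0; i < 90; i++) {
  size += 0.01;
}
console.log(size);       // very close to 1, but not exactly 1
console.log(size === 1); // false, so an "equals 1" trigger never fires
console.log(size >= 1);  // true, so a >= comparison would catch it
```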