Yes, the new code makes more sense.
But being the smartass I am, I can't let it go and need to talk about the 'timer' expression again. You are using 'Every 1000 milliseconds', which isn't a bad thing in itself. But you are not actually counting seconds this way, and here is why:
Every event, including 'Every X milliseconds', can only be triggered on a tick. A tick has a duration, and that duration depends on many details: the number of events that need to be executed (something that differs from tick to tick), the speed of the processor the app runs on, and even the operating system may interfere because of heavy hard disk access or other background tasks.
This all leads to noticeably different tick durations, even when overall the game runs at a stable 30, 60 or whatever fps. So when will 'Every X milliseconds' be executed? On the nearest tick on or right after the specified time. In other words, 'Every 1000 milliseconds' may be executed after 1002 ms the first time, after 1011 ms the second time, after 1000 ms the third time, and so on.
I know, this might not be essential for a game, but then again, if you really want to count seconds (up or down), you can only rely on 'timer' (or, as a second choice, TimeDelta), because those are measured outside of the event loop, so whenever you read 'timer' you get a very accurate time value.
But, as I said before, this might not be that important for a game.