I think the alternate tick system will still work on a 120Hz monitor - at least in my example, it still uses dt on alternate ticks, which means on a 120Hz monitor it moves in half-size steps at 60 Hz instead of normal-size steps at 30 Hz. If you force dt to be constant, you are making a game design choice that breaks this, and the game will run at double speed on a 120Hz monitor.
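To illustrate the point, here's a minimal sketch (the function names and the 120 px/s speed are just made up for the example) of why dt-scaled movement covers the same distance regardless of the tick rate, while a forced-constant dt doesn't:

```javascript
const SPEED = 120; // pixels per second

// Simulate one second of movement: 'logicRate' ticks, each advancing
// the object by SPEED * dtPerTick pixels.
function simulateOneSecond(logicRate, dtPerTick) {
  let x = 0;
  for (let i = 0; i < logicRate; i++) {
    x += SPEED * dtPerTick;
  }
  return x;
}

// Measured dt: framerate-independent. Same distance either way.
console.log(simulateOneSecond(30, 1 / 30)); // 120 px (60Hz display, alternate ticks)
console.log(simulateOneSecond(60, 1 / 60)); // 120 px (120Hz display, alternate ticks)

// dt forced constant at 1/30 (tuned for a 60Hz display):
console.log(simulateOneSecond(60, 1 / 30)); // 240 px - double speed on 120Hz
```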
As for varying the logic update rate, I can't see it working for a general-purpose engine like Construct 2. There are two options:
- run logic at a lower rate than the display rate: this is fiddly to do in a way that doesn't just look like it's running at the lower framerate. Moving objects can be stepped, but they still have to be collision-tested every step, and collisions can run events with further consequences in general, so by that point you may as well just run all your logic every step. I think this can work for custom-coded games with specific requirements, but I really doubt it's something you can just turn on for most games in general and have it "just work" - a lot of the time, the result would just look like the game updates at the lower rate as well. Or you'll get really terrible collision skips, where objects basically pass through other objects without registering a collision, because they only overlapped briefly and there wasn't a "real" tick during that time.
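The collision-skip problem is easy to reproduce. A rough sketch (the wall position, speed and rates are made-up numbers, and "collision" is just an interval-overlap check): a fast object that registers a hit when stepped at 60 Hz jumps clean over the same wall when stepped at 30 Hz.

```javascript
// An 11px-wide wall spanning x = 101..112.
const wall = { left: 101, right: 112 };

// Step an object from x = 0 for one second at the given logic rate,
// checking for overlap with the wall after every step.
function hitsWall(speed, logicRate) {
  const dt = 1 / logicRate;
  let x = 0;
  for (let i = 0; i < logicRate; i++) {
    x += speed * dt;
    if (x >= wall.left && x <= wall.right) return true; // overlap detected
  }
  return false;
}

console.log(hitsWall(600, 60)); // true  - 10px steps land inside the wall
console.log(hitsWall(600, 30)); // false - 20px steps skip straight over it
```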
- run logic at a higher rate than the display rate: this basically multiplies your CPU usage and hammers performance. Say you run logic at 120 Hz and display at 60 FPS - that doubles your CPU usage. People complain enough about performance as it is; I don't think we should add features that make it worse. Also, if the logic rate is not a multiple of the display rate, e.g. 90 Hz logic and a 60 FPS display, this creates the uneven-stepping problem you've already seen in this thread. If an object is moving 1 pixel per logic step, there aren't enough display frames to show each update, so you get an irregular pattern of 1-2-1-2-1-2...
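You can see the uneven stepping just by counting how many logic steps fall inside each display frame (a quick sketch - the helper is made up for illustration). At 90 Hz logic on a 60 Hz display, the count alternates, so an object moving 1 pixel per step appears to jump 2-1-2-1 pixels per frame:

```javascript
// Count how many logic steps (at logicHz) start within each of the
// first 'frames' display frames (at displayHz). A logic step k falls in
// display frame n when n/displayHz <= k/logicHz < (n+1)/displayHz.
function stepsPerFrame(logicHz, displayHz, frames) {
  const counts = [];
  for (let n = 0; n < frames; n++) {
    const first = Math.ceil(n * logicHz / displayHz);
    const next = Math.ceil((n + 1) * logicHz / displayHz);
    counts.push(next - first);
  }
  return counts;
}

console.log(stepsPerFrame(90, 60, 6));  // → [2, 1, 2, 1, 2, 1] - uneven
console.log(stepsPerFrame(120, 60, 6)); // → [2, 2, 2, 2, 2, 2] - even multiple
```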
So I don't see that either option is really any better. C2 has always done one logic step per display frame, and I think that makes the most intuitive sense anyway: any less and sometimes there is nothing new to draw; any more and you burn CPU time simulating things you'll never see. And C2's approach has worked well for a very wide variety of games.