Setting the fps to a certain value will only give a smooth result if you know the display's vsync rate and then set the update rate to a whole-number fraction of it. I have an old laptop that drops its vsync to 50 Hz when running on battery, and an Android tablet that uses 50 Hz by default (ugh..)... 30 does not divide evenly into 50, so using 30 in that case would cause some jank or apparently uneven movement. And there are plenty of monitors running at other vsync rates that 30 doesn't divide into evenly (e.g. 75 Hz, 100 Hz, etc.).
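To make that concrete, here's a quick TypeScript sketch (the function name and the numbers are just illustrative) that counts how many logic steps land on each display frame. When the logic rate divides the refresh rate you get a regular pattern; when it doesn't, the pattern is uneven, and that unevenness is the jank I mean:

```typescript
// How many logic steps fall on each display frame when a fixed logic rate
// runs against a given refresh rate (illustrative numbers only).
function stepsPerFrame(logicHz: number, refreshHz: number, frames: number): number[] {
  const steps: number[] = [];
  for (let i = 0; i < frames; i++) {
    // Logic steps completed by the end of frame i, minus those by the end of frame i-1.
    const before = Math.floor((i * logicHz) / refreshHz);
    const after = Math.floor(((i + 1) * logicHz) / refreshHz);
    steps.push(after - before);
  }
  return steps;
}

console.log(stepsPerFrame(30, 60, 6)); // [0, 1, 0, 1, 0, 1]  - regular pattern: smooth
console.log(stepsPerFrame(30, 50, 5)); // [0, 1, 0, 1, 1]     - irregular pattern: visible jank
```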
Thus, what you should really ask for (IMO) is a graceful degrade of the logic fps - a runtime method of detecting when the average logic time is greater than the time available per display frame (most games will jank a little if the player has a 240 Hz monitor, for example, because the available logic time is only about 4 ms per tick, ignoring garbage collection etc.). But the jank may still be hidden by the overall framerate (the limit of human perception) if it only happens occasionally.
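As a rough idea of what that detection could look like - just a sketch, and the class name, sample window and safety margin are all made up for illustration - you'd time each logic tick (e.g. with performance.now()) and compare a rolling average against the per-frame budget:

```typescript
// Sketch of a runtime check: track a rolling average of logic-step cost and
// flag when it no longer fits inside one display frame.
class LogicBudgetMonitor {
  private samples: number[] = [];
  constructor(private sampleCount = 120) {}

  // Record how long one logic tick took, in milliseconds.
  addSample(logicMs: number): void {
    this.samples.push(logicMs);
    if (this.samples.length > this.sampleCount) this.samples.shift();
  }

  averageMs(): number {
    if (this.samples.length === 0) return 0;
    return this.samples.reduce((a, b) => a + b, 0) / this.samples.length;
  }

  // True when the average logic cost exceeds the per-frame budget,
  // e.g. roughly 4.17 ms on a 240 Hz display (with some headroom kept back).
  overBudget(refreshHz: number, safetyMargin = 0.8): boolean {
    const frameBudgetMs = 1000 / refreshHz;
    return this.averageMs() > frameBudgetMs * safetyMargin;
  }
}
```

You'd call addSample() with the measured tick time each frame, then check overBudget() every second or so before deciding whether to degrade.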
In a platform-agnostic solution the runtime would have to allow the game logic and display timing to be decoupled from each other. So, in a logic-heavy game all sprites would be moved at a fraction of the vsync fps (e.g. 0.5 x vsync or 0.33 x vsync - whatever gives a whole number of display frames per logic tick). For a 75 Hz monitor, the logic would have to degrade to 25 fps (one tick every third frame). And so on. That would take a bit of a rewrite of the runtime logic, I think.
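Something like this is what I mean by degrading to a whole fraction of the vsync rate (again just a sketch - the function name, the 0.8 headroom factor and the divisor cap are my own assumptions, not anything a real runtime does):

```typescript
// Pick a degraded logic rate that stays a whole fraction of the refresh rate:
// run one logic tick every N display frames, increasing N until the measured
// logic cost fits inside the available time per tick.
function chooseLogicRate(refreshHz: number, avgLogicMs: number, maxDivisor = 6): number {
  for (let n = 1; n <= maxDivisor; n++) {
    const logicHz = refreshHz / n;                      // e.g. 75 -> 75, 37.5, 25, ...
    const budgetMs = 1000 / logicHz;                    // time available per logic tick
    if (avgLogicMs < budgetMs * 0.8) return logicHz;    // keep ~20% headroom
  }
  return refreshHz / maxDivisor;                        // worst case: heavily degraded
}

console.log(chooseLogicRate(75, 30)); // 25 - one logic tick every third frame
```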
How can a browser detect the user's display vsync...? I don't know of a way to obtain that directly from JS, other than measuring the requestAnimationFrame() callback intervals... Oh, and if requestAnimationFrame() fires at exactly 60.00 Hz then it's probably because the browser failed to detect the vsync and has instead defaulted to the most likely callback rate in the absence of any data (so, even then, you can't guarantee you're getting the real vsync value).
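For what it's worth, the only probe I can think of looks something like this (TypeScript sketch; the sample count and the use of a median are arbitrary choices) - and per the caveat above, a result of exactly 60 Hz should probably be treated as "unknown" rather than as the true vsync:

```typescript
// Rough refresh-rate probe: sample requestAnimationFrame timestamps and take
// the median interval between callbacks.
function estimateRefreshRate(sampleCount = 60): Promise<number> {
  return new Promise((resolve) => {
    const times: number[] = [];
    function frame(t: number): void {
      times.push(t);
      if (times.length < sampleCount) {
        requestAnimationFrame(frame);
        return;
      }
      // Deltas between consecutive timestamps, sorted so we can pick the median.
      const deltas = times.slice(1).map((v, i) => v - times[i]).sort((a, b) => a - b);
      const medianMs = deltas[Math.floor(deltas.length / 2)];
      resolve(1000 / medianMs);
    }
    requestAnimationFrame(frame);
  });
}

estimateRefreshRate().then((hz) => console.log(`~${hz.toFixed(1)} Hz`));
```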
Unless I'm missing something...?