For my game, DinoSystem, I previously used a workaround to get players' discrete GPU used without forcing them to mess with the NVidia/ATI control panels: I exported the game with NW.js and named the exe "run_game.exe", which the NVidia settings tag by default for discrete GPU usage.
I've always wanted to switch to WebView2, since Ashley plans to fully support C3's own exporters and drop NW.js, which seems like a great idea.
The opportunity to do that arose when I discovered that adding "--force_high_performance_gpu" to the game's arguments when exporting makes it use the discrete GPU regardless of the default card settings, so I switched to WebView2.
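For context, the in-page equivalent of that kind of flag is the standard WebGL powerPreference context attribute. This is only an illustrative sketch of that hint (C3 creates its own canvas and context, so this isn't something I'm actually doing in the project):

```ts
// Illustrative only: the standard WebGL hint for requesting the
// high-performance (discrete) GPU from inside the page.
const canvas = document.createElement("canvas");
const gl = canvas.getContext("webgl2", {
  powerPreference: "high-performance", // ask the browser for the discrete GPU
});
console.log(gl ? "got a high-performance context" : "WebGL2 unavailable");
```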
But I recently noticed that the exported game runs at 55-56 FPS: it uses the discrete GPU, but it doesn't seem to be using it fully, and there's some subtle jankiness instead of running butter smooth like in preview. I tried reducing the game's graphics, map size, and entity count; nothing worked. The game runs at a butter-smooth 60 FPS in C3 preview, but at 55-56 FPS with subtle lag when exported.
I then enabled discrete GPU usage for the WebView2 (WV2) exe in the Program Files/Microsoft folder, and... it fixed it.
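(For anyone wanting to confirm which adapter the exported game actually ended up on, instead of guessing from the framerate, the unmasked renderer string can be read from a WebGL context. A minimal sketch, assuming a throwaway context can be created in the exported page:)

```ts
// Minimal check of which GPU the browser/wrapper actually picked.
const canvas = document.createElement("canvas");
const gl = canvas.getContext("webgl");
if (gl) {
  const ext = gl.getExtension("WEBGL_debug_renderer_info");
  const renderer = ext
    ? gl.getParameter(ext.UNMASKED_RENDERER_WEBGL)
    : gl.getParameter(gl.RENDERER);
  console.log("Active GPU:", renderer); // e.g. "ANGLE (NVIDIA GeForce ...)"
}
```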
So I'm back on NW.js, since I can just keep the "run_game" exe name and use the user's discrete GPU (again, I don't want to ask Steam users to mess with their card settings to play my game...).
Is there anything else I could do? Are there any fixes, aside from --force_high_performance_gpu, which doesn't fully solve it but actually creates a new, more subtle problem? I don't want to stick with NW.js, but it seems like I have no choice now.