The lower the resolution you render at, the lower the demand on the gfx card.
You may not notice a difference in performance until you have a lot of sprites with a lot of overdraw / layers, but obviously the more pixels that need to be processed before up-scaling, the harder the gfx card has to work. With a high-powered gfx card you may never see a difference at all.
In your case, using high-definition artwork and resizing, the setting you need to be most aware of is
"Fullscreen Quality".
Setting Fullscreen Quality to "Low" renders the game at your specified "game window" (C2) / "viewport" (C3) resolution and then up-scales it to the resolution of your screen. This is what most modern games do, as it allows a degree of compromise between gfx effects and resolution, and thus control over graphics performance. Depending on how low your render resolution is, there will be a degree of blurriness (using linear sampling) or jagginess (using point sampling) in the up-scaled image.
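If it helps to picture what "Low" is doing under the hood, here's a rough sketch using the plain HTML canvas API. This is not Construct's actual renderer; the drawScene callback and the 640x360 viewport size are made-up placeholders, it's just the render-small-then-stretch idea:

```ts
// Sketch of the "Low" fullscreen-quality path: render at a fixed
// low resolution, then up-scale to whatever size the display canvas is.
function renderLowQuality(
  display: HTMLCanvasElement,
  drawScene: (ctx: CanvasRenderingContext2D) => void,
  pointSampling: boolean
): void {
  // Off-screen back buffer at the game's design ("viewport") resolution.
  const back = document.createElement("canvas");
  back.width = 640;  // hypothetical C2 "game window" / C3 "viewport" size
  back.height = 360;

  const backCtx = back.getContext("2d")!;
  drawScene(backCtx); // only 640x360 pixels get processed here

  const ctx = display.getContext("2d")!;
  // Point sampling keeps pixels crisp but jagged; linear sampling blurs.
  ctx.imageSmoothingEnabled = !pointSampling;
  // Stretch the small back buffer over the full display canvas.
  ctx.drawImage(back, 0, 0, display.width, display.height);
}
```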
Setting Fullscreen Quality to "High" ignores your "game window" resolution entirely and renders the game at the resolution of the display screen, keeping only the aspect ratio specified by your "game window". So if you are viewing full screen on a 1080p screen it is rendering at a full 1080p; if you are viewing on a 4K screen it is rendering at 4K. This will give you nice sharp graphics but will hit the gfx card hard, so it may result in performance degradation on lower-spec machines.
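And a rough sketch of what "High" amounts to, again in plain canvas API terms rather than anything Construct-specific. The 16:9 aspect ratio is a made-up example; the point is that the back buffer is sized to the physical screen, not the project's resolution:

```ts
// Sketch of the "High" fullscreen-quality path: the back buffer matches
// the physical display, so a 4K screen means a 4K render.
function sizeForHighQuality(display: HTMLCanvasElement): void {
  const aspect = 16 / 9; // hypothetical project aspect ratio
  // Physical pixels available, accounting for high-DPI displays.
  const maxW = window.innerWidth * window.devicePixelRatio;
  const maxH = window.innerHeight * window.devicePixelRatio;

  // Largest canvas that fits the screen while keeping the aspect ratio;
  // the remainder is letterboxed by the page background.
  const height = Math.min(maxW / aspect, maxH);
  display.width = Math.round(height * aspect);
  display.height = Math.round(height);
  // Every frame now renders at this native resolution: sharp, but far
  // more pixels for the gfx card to fill.
}
```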
You can give the user access to change this setting at runtime.
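A runtime toggle wired to an options menu might look something like this. Again this is just a sketch of the idea, not Construct's API, and 640x360 is a made-up design resolution:

```ts
// Hypothetical quality toggle: the player trades sharpness for
// performance without restarting the game.
function setFullscreenQuality(display: HTMLCanvasElement, high: boolean): void {
  if (high) {
    // Back buffer at native display resolution (sharp, GPU-heavy).
    display.width = Math.round(window.innerWidth * window.devicePixelRatio);
    display.height = Math.round(window.innerHeight * window.devicePixelRatio);
  } else {
    // Back buffer at the design resolution, stretched to fit (cheap).
    display.width = 640;
    display.height = 360;
  }
  // The CSS size stays full-screen either way; only the number of
  // pixels actually rendered changes.
  display.style.width = "100vw";
  display.style.height = "100vh";
}
```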