Optimal Resolution for Node.js games

  • Recently I've been seeing performance issues with my NW.js game on laptops and other computers without hardware acceleration. Also I've learned about how NW export games load all the layouts into memory and never free any of it (almost like how sound is treated). However, I'm not sure overall memory is my real issue.

    My game's resolution is 1920x1080. The performance issues I am seeing aren't terrible, but I fear they will get worse. I believe most of the cost is coming from draw calls. I have some decent-sized background sprites, and I've noticed a lag-like effect when my player moves over them (on these lower-tier computers). And when objects are pinned I see them move slightly, only to be snapped back into place, creating a little wobble effect.

    I'm trying to gauge whether changing the game's resolution (drastic, I know) would be worth it - whether I actually do it or not is a completely different story.

    Have there been any benchmark tests done on Node.js games? Any recommendations on what the optimal resolution for performance would be for a larger PC game? For example, I made a pixel-art game at a jam once and that game's resolution was 200x150, which of course is upscaled when running. I assume a game at that resolution has far less drawing work to do per frame, giving better visual performance?
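    For a rough sense of the numbers involved, here's a back-of-the-envelope comparison. It assumes rendering cost scales roughly with the number of pixels filled per frame, which ignores draw call overhead, overdraw and the cost of the final upscale, so treat it as illustrative only:

    ```ts
    // Illustrative only: assumes per-frame cost scales with pixels filled.
    function pixelsPerFrame(width: number, height: number): number {
      return width * height;
    }

    const fullHD = pixelsPerFrame(1920, 1080); // 2,073,600 pixels
    const pixelArt = pixelsPerFrame(200, 150); // 30,000 pixels

    console.log(`1920x1080 fills ~${Math.round(fullHD / pixelArt)}x more pixels per frame`);
    // => roughly 69x more pixels, before the (comparatively cheap) final upscale
    ```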

  • Running in windowed mode at native or generally very small resolutions makes our game run much, much better on entry- to mid-level/average desktops and laptops. At fullscreen resolution, though (even though the base resolution is less than 320x240), there is still major slowdown, which is apparently caused by the upscaling/fill rate, going by past conversations about the performance issues of Node.js/node-webkit.

    An option to force the monitor resolution would be really nice, and probably the only real hope of improving performance (and before someone swoops in to say "optimize your events before blaming the engine!": yes, we have; we have done everything we could to ensure this).

  • Advanced NW.js games do not work well on integrated GPUs from Intel, like the HD4000 or below. Intel's older iGPUs perform quite slowly with OpenGL games in my experience, and C2's renderer is OpenGL.

    Interesting, because a lowly AMD A6 or A8 can accelerate it perfectly well.

    My current project, Star Nomad 2, lags on Intel HD4000 or below. I don't care, since my minimum spec is an AMD or NV GPU.

  • Good point, although the average Steam user is running an integrated/weak Intel GPU, so advanced games (which could still be targeted at a casual, fairly wide-reaching audience) are unable to reach that large chunk of the audience. (Source: http://store.steampowered.com/hwsurvey?platform=pc - scroll down to see the most common GPU per DirectX version.)

    As for AMD, be careful there too: we have found that even computers which, on paper, should run the game better than an Intel HD4000 laptop have had issues. The older cards (2010-2013 and earlier, basically pre-WebGL-era hardware), despite having way more "power" than the Intel chip, showed equal or worse lag playing our game. We haven't seen this happen with NVIDIA yet.

  • Oh yeah, AMD GPUs prior to GCN (6970 and below) are no longer supported in drivers, so they often misbehave.

    NV has dropped support for older GPUs too, so the 460/470/480 and below may have issues.

    I guess that's just the nature of GPU hardware: 2 years = old, 3 years = very old, and 4 or more years = obsolete.

    My point is: don't expect to make a complex C2 game, with lots of things happening and shaders running often, and have it run well on an Intel GPU. It may on Haswell and Skylake (HD5000+), but in my testing WebGL shader effects wreck performance on HD4000 or below, even in very simple games. A sad state of affairs, I understand. So if any C2 dev wants to target HD3000/4000, do not use WebGL effects; you may actually get far better performance with WebGL disabled, running in software mode (Intel's CPUs are very fast for software rendering). A rough detection sketch follows at the end of this post.

    This came up a while ago: there are WebGL benchmarks, and when people tested their Intel GPUs the results were horrendously bad.
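    To make the HD3000/4000 point concrete, here is a rough, hedged sketch of how an HTML5/NW.js game could check the renderer string at runtime and switch off optional effects. The WEBGL_debug_renderer_info extension is not guaranteed to be exposed (and the string format varies by driver), so this is only a heuristic, not anything C2 does itself:

    ```ts
    // Heuristic only: the extension may be missing or masked in some browsers.
    function isWeakIntelGPU(): boolean {
      const canvas = document.createElement("canvas");
      const gl = canvas.getContext("webgl");
      if (!gl) return true; // no WebGL at all: definitely avoid shader effects

      const ext = gl.getExtension("WEBGL_debug_renderer_info");
      if (!ext) return false; // can't tell, assume the GPU is fine

      const renderer = String(gl.getParameter(ext.UNMASKED_RENDERER_WEBGL));
      // e.g. "Intel(R) HD Graphics 4000" -- exact wording varies by driver
      return /intel.*hd graphics\s*(2000|2500|3000|4000)/i.test(renderer);
    }

    // A game could then skip its optional WebGL shader effects:
    const useShaderEffects = !isWeakIntelGPU();
    ```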

  • Also I've learned about how NW export games load all the layouts into memory and never free any of it

    That's not true, layout-by-layout loading frees images when you change layouts, so it only ever has one layout loaded. See Memory usage.

    If you reduce the window size (a misleading name: it should really be "viewport size") and use the same size window with letterbox scale, it will make no difference at all: the same rendering work is done. What does make a difference is reducing the window size and turning on "low quality" fullscreen mode. That renders the entire game at the smaller size, then stretches the final result up to the display size. It's almost the same as actually switching the monitor resolution (which would just do the final stretch on the monitor's display chip, but GPUs are really fast at stretching things anyway).
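    To illustrate the idea, here is a minimal sketch of render-small-then-stretch using a plain 2D canvas (this is not C2's actual renderer, and the sizes are made-up numbers):

    ```ts
    const DISPLAY_W = 1920, DISPLAY_H = 1080; // assumed monitor size
    const RENDER_W = 640, RENDER_H = 360;     // assumed internal (viewport) size

    // Visible canvas at full display size.
    const screen = document.querySelector("canvas")!;
    screen.width = DISPLAY_W;
    screen.height = DISPLAY_H;

    // Small offscreen canvas: all the expensive per-pixel work happens here.
    const offscreen = document.createElement("canvas");
    offscreen.width = RENDER_W;
    offscreen.height = RENDER_H;

    const screenCtx = screen.getContext("2d")!;
    const gameCtx = offscreen.getContext("2d")!;

    function frame() {
      gameCtx.clearRect(0, 0, RENDER_W, RENDER_H);
      // ...draw the entire game into gameCtx at 640x360 here...

      // One stretch-blit to the display; the GPU handles this very cheaply.
      screenCtx.imageSmoothingEnabled = false; // "low quality" / crisp pixels
      screenCtx.drawImage(offscreen, 0, 0, RENDER_W, RENDER_H, 0, 0, DISPLAY_W, DISPLAY_H);
      requestAnimationFrame(frame);
    }
    requestAnimationFrame(frame);
    ```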

    Intel graphics chips suck. They share system memory so have really constrained bandwidth and very low fillrate limits. You have to design games very carefully to work well on them. The good news is many laptops with Intel chips are really dual-GPU and have a powerful AMD or nVidia card as well. If you can make sure everyone switches your game to the powerful GPU, all your performance issues should be resolved!

  • I agree. It seems like technology has evolved only to complicate gameplay/export options. It's not ideal to have to tell customers that their 2-year-old laptop is not supported by your nw.js game in fullscreen, so they'd better go back to playing GTA on it instead, lol...

  • ... but at fullscreen resolution (even though the base resolution is less than 320x240) there is still major slowdown, which is apparently caused by the upscaling/fill rate, going by past conversations about the performance issues of Node.js/node-webkit

    Is this with turning on "low quality" in Project Properties?

  • That's not true, layout-by-layout loading frees images when you change layouts, so it only ever has one layout loaded.

    I have observed when playing (the NW.js export) there are 3 processes (2 of them are background processes) that end up growing as the game is being played. So far in my game it seems to cap out at around 600MB (total). I was under the impression the reason for this is that NW caches objects in memory to allow them to be loaded without any jank and that it caches each Layout as they are loaded. I'm well aware of C2's Layout memory management, I just figured it was something negated by NW.js, but if you are saying that is not the case, then that's great news. Although it still does not explain this memory usage.

  • If you can make sure everyone switches your game to the powerful GPU, all your performance issues should be resolved!

    sorry if this is a dumb question, but can you explain this "switching"? Are you saying users have control over how their systems run a NW.js game? This reminds me of how in Blender you can set the render to CPU or GPU, the latter giving you a huge boost to render times.

  • ...my minimum spec is an AMD or NV GPU.

    It's probably best to have this in the requirements, but to Colludium's point, I can also see it frustrating users, since it seems overkill for a 2D engine. But I guess... what can you do? It is what it is.

  • I have observed when playing (the NW.js export) there are 3 processes (2 of them are background processes) that end up growing as the game is being played. So far in my game it seems to cap out at around 600MB (total).

    If you're looking in Task Manager, those numbers don't even include textures, because they're stored separately in GPU memory. Those numbers can be difficult to interpret too: see understanding CPU and memory measurements.
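    For anyone wanting to watch this from inside the game rather than Task Manager, a hedged sketch: Chromium (and therefore NW.js) exposes a non-standard performance.memory object, but it only covers the JS heap, not GPU texture memory, which is one reason the numbers never line up exactly:

    ```ts
    // Non-standard, Chromium-only API; only reports the JS heap.
    interface ChromeMemoryInfo {
      usedJSHeapSize: number;
      totalJSHeapSize: number;
      jsHeapSizeLimit: number;
    }

    function logHeapUsage(): void {
      const mem = (performance as unknown as { memory?: ChromeMemoryInfo }).memory;
      if (!mem) return; // not running on a Chromium-based engine

      const mb = (n: number) => (n / 1024 / 1024).toFixed(1);
      console.log(`JS heap: ${mb(mem.usedJSHeapSize)} MB used of ${mb(mem.totalJSHeapSize)} MB`);
    }

    setInterval(logHeapUsage, 5000); // sample every 5 seconds while playing
    ```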

    sorry if this is a dumb question, but can you explain this "switching"?

    Lots and lots of laptops (even the one I'm typing on now) ship with two GPUs: a low-power, cheap and weak Intel GPU, designed to save battery, and a gaming-grade AMD or nVidia chip to run games fast. Both nVidia and AMD have really crap systems that automatically pick the GPU, and never pick the fast one for indie games, only AAA titles. For a while they wouldn't even let you manually pick the fast GPU for browsers, although I think that is fixed now. So GTA starts up on the powerful GPU and your browser or NW.js probably defaults to the weak Intel chip. Maybe if you got in touch with both of them they could whitelist your game for the powerful GPU, but you have to do it for every game you publish, and it involves trying to talk to a big corporation who probably don't care.

    So you can tell the users to do it themselves: normally there is a control panel that allows users to manually set the GPU for each program. For example on my laptop I can open nVidia settings, go to "Manage 3D settings", and manually add a program to use the "High performance nVidia processor". It only lists programs nVidia think are noteworthy by default, so sometimes you have to manually add a program to the list too. I've seen this alone entirely solve all performance issues with HTML5 games, but it is not the default, and lots of people probably don't know the option exists. So the whole system is kind of rubbish and biased against independent developers, but if you can make sure users are aware they might have a second GPU and may need to manually set the game to use it for it to run well, that might help.
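    As an aside (and this depends on the Chromium version NW.js ships with, so treat it as a hint rather than a fix), newer browsers also let the page itself ask for the high-performance GPU when it creates its WebGL context:

    ```ts
    // "powerPreference" is only a hint; the browser, driver and OS switching
    // logic are all free to ignore it, and older engines don't support it.
    const canvas = document.createElement("canvas");
    const gl = canvas.getContext("webgl", {
      powerPreference: "high-performance", // prefer the discrete nVidia/AMD GPU
    });

    if (!gl) {
      console.warn("WebGL unavailable; the game would fall back to canvas 2D");
    }
    ```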

  • > Also I've learned about how NW export games load all the layouts into memory and never free any of it

    That's not true, layout-by-layout loading frees images when you change layouts, so it only ever has one layout loaded.

    This may be true for Construct 2, but the wrappers do NOT behave that way.

    NW.js, for example, has a memory-cache nw.exe process alongside the main nw.exe; it stores any recently accessed asset in that cache and never lets it go until the entire process is ended (exit game).

    This means if the game has 900MB of images/sprites in memory format, that cache will be ~900MB, even on a minimal main title screen layout.

    So while C2 behaves correctly when you change layout (dumping the current stuff from memory), Chromium (NW and Chrome) does not. Try it and you will see: just make a blank layout, throw a lot of 2048 x 2048 single-colour sprites onto a layout that you never call (an "Assets Layout"), and load the blank layout at the start. You'll see the processes I am referring to, with one bloated up by however many ~16MB 2048 x 2048 textures you added.
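    For reference, the ~16MB figure is just the standard uncompressed-texture arithmetic (nothing NW.js-specific), which also shows how quickly a pile of unused sprites adds up:

    ```ts
    // Uncompressed 32-bit RGBA texture: width * height * 4 bytes in memory,
    // regardless of how small the PNG/JPEG is on disk.
    function textureBytes(width: number, height: number): number {
      return width * height * 4;
    }

    const perSprite = textureBytes(2048, 2048); // 16,777,216 bytes (~16 MB)
    console.log(`${(perSprite / 1024 / 1024).toFixed(0)} MB per 2048x2048 sprite`);

    // So roughly 56 such sprites already accounts for ~900 MB of image data.
    console.log(Math.round((900 * 1024 * 1024) / perSprite)); // ~56
    ```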

    I know Chromium also behaves this way on Android Chrome and Crosswalk, so bigger games will eventually eat up a lot of memory in the cache process.

    C2's clearing does seem to affect GPU VRAM, so layout changes flush that out correctly, but it doesn't affect the way Chromium handles its cache (loading assets from disk into system memory). This is from monitoring GPU VRAM with Afterburner across different layout transitions: GPU VRAM behaves correctly, but system RAM is never released.

  • So you can tell the users to do it themselves: normally there is a control panel that allows users to manually set the GPU for each program.

    This is great to add for support! thanks..

    My test machine is a $350 USD laptop with an HD4400 GPU. I looked in the control panel and didn't see any 3D settings or anything like that. I'm glad to have such a low-end machine so I can see how the game will perform at the low end of the spectrum.

    I'm just wondering: if I had made the game's art low-res pixel art and the entire resolution was 240x135, would I be seeing the same issues?

  • NW.js, for example, has a memory-cache nw.exe process alongside the main nw.exe; it stores any recently accessed asset in that cache and never lets it go until the entire process is ended (exit game).

    This means if the game has 900MB of images/sprites in memory format, that cache will be ~900MB, even on a minimal main title screen layout.

    Cache is not the same as used memory, which the blog post I linked to covers somewhat. Unused memory is wasted memory, so maybe Chrome or even the OS maintains a cache of the decompressed images, or it's related to garbage collection. The thing about caches, though, is that they can be evicted to free up memory if the system ends up running low. So in a way it doesn't count: it's not going to make you run out of memory, it just speeds things up when there's spare memory available.

    Of course it's possible there's a C2 bug that is leaking memory (although it seems unlikely by now, with many very large games running just fine for some time). The key is if it crashes due to memory exhaustion, then by all means go ahead and file a bug report. Otherwise it's probably just hanging on to resources in case they're needed again.
