Is C2/C3 good for large 2D desktop games?

  • I haven't done any significant testing, only some small tests out of personal curiosity. But when it comes to rendering I didn't find any issues.

    If you have a simple test scene running at 160 fps native but 60 fps in WebGL, it doesn't mean native has double the performance: WebGL is capped at 60 fps, so you can't really compare.

    Only when you start adding loads of stuff can you start to compare.

    Set a target fps of 30 and start spawning rotating sprites (like in the C3 example); then you can tell the difference. The engine that spawns more sprites is the winner.

    The only thing I noticed in C2/C3 is that picking and loops sometimes feel like they use more CPU, or have more overhead, than they should, and would probably be faster in JavaScript or native code. I don't have anything to compare with, but behaviours often seem a lot faster for these kinds of things.

    Another thing is draw calls. I'm not sure if these are faster on native or not, but their cost scales with the number of sprites on screen: the more sprites you have on screen, the more CPU they use.

    As picking/loops and draw calls scale with the number of sprites, maybe this is the only bottleneck in C2? A badly optimized game would have fewer resources left for draw calls, so slower rendering?

    So my conclusion is that the only things that could probably be faster in native are draw calls and loops; the actual rendering is just about as fast. More sprites and effects in C2 would maybe add more overhead?

    I could be wrong, but that's just my observation from my own testing, and I haven't done any cross-engine testing of this.
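The sprite-spawning benchmark above just needs a frame-time log to score the two engines against each other. A minimal plain-JavaScript sketch (the helper name is made up, not part of Construct):

```javascript
// Hypothetical helper for the benchmark described above: given the
// timestamps (in ms) logged once per rendered frame, compute the
// average frames per second over the run.
function averageFps(frameTimestamps) {
  if (frameTimestamps.length < 2) return 0;
  const elapsedMs =
    frameTimestamps[frameTimestamps.length - 1] - frameTimestamps[0];
  const frames = frameTimestamps.length - 1;
  return (frames / elapsedMs) * 1000;
}

// In a browser you would collect timestamps with requestAnimationFrame:
//   const stamps = [];
//   function tick(t) { stamps.push(t); requestAnimationFrame(tick); }
//   requestAnimationFrame(tick);
// ...then keep spawning sprites until averageFps(stamps) drops below
// the 30 fps target; the engine that sustains more sprites wins.
```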

  • Disable V-sync in Chrome, or in NW.js via a Chromium command-line flag, to go above 60 fps. That should really not stop anyone from producing good performance benchmarks.

  • This is 100% untrue blame-shifting nonsense.

    Citation needed.

    My statement is based on three things:

    1) webglstats.com measures a wide range of devices and produces an overall support number of 96% (notwithstanding a minor recent bump). This matches data I've seen from our own measurements, and also matches what is implied from browser vendors talking about it.

    2) The architecture of WebGL is basically a thin layer on top of OpenGL calls, so it works the same way as a native engine: once the calls reach the hardware, GPU performance is identical. I am strictly talking about GPU performance here; if your game is slow due to CPU performance, that is unrelated to what I am stating. To work out whether your game is CPU- or GPU-bound, you need to make measurements and consider the data carefully, and I've seen little evidence of people doing that before they start complaining about what they assume the problem is.

    3) Many, many times I have been shouted down by someone who claims HTML5 is slow and doesn't believe this technical matter of poor quality drivers and GPU blacklisting, sometimes even accusing me of being deliberately deceptive. Then I ask for their game or some kind of evidence of why it's slow. Then I investigate. Then it turns out it was GPU blacklisting all along. If you are in the unlucky 4% it sucks, but that's not a fair representation of the wider hardware ecosystem.

    Again, the irony is that it's not HTML5 that is immature - it's the graphics drivers, which are native technology. As I repeatedly have to say to people who don't seem to believe this, graphics drivers are widely accepted to be terrible and can even completely ruin game launches. Here are two links to back this up:

    Batman: Arkham Knight had such poor PC performance it virtually ruined the launch - it ran fine on console, where the drivers are good quality and predictable

    "How to delay your indie game" specifically calls out struggling with GPU drivers: "Despite claiming to, their drivers just don’t properly implement the OpenGL specifications... Then a new OSX update will hit and everything will change again, yet at the same time I will need to still cater for the previous releases and older machines... it’s unbearable... Would I support Mac again? No."

    In technology, knowing who to really blame is a surprisingly difficult and nuanced problem. It's rarely as obvious as it seems. For example lots of people, even engineers, blame the browser for bloated web pages.

    I am so tired of this that unless you produce evidence to the contrary, I will just ignore any further complaints. I would actually be super interested to see evidence to the contrary, because I never have. And if you don't provide any, I would kindly ask that you take a less accusatory approach to the issue.
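The CPU-vs-GPU distinction in point 2 can be made concrete with a rough heuristic: if script time per frame is near the frame budget, you are CPU-bound; if script time is low but frames still overrun the budget, the GPU is the likelier bottleneck. This is a hypothetical sketch, not Construct's profiler; the function name and the 80% threshold are made up:

```javascript
// Rough, hypothetical heuristic: classify a frame sample as
// CPU-bound, GPU-bound, or neither.
//   frameMs  - total time between successive frames
//   cpuMs    - time spent in script/logic during that frame
//   budgetMs - target frame budget (16.7 ms for 60 fps)
function classifyBottleneck(frameMs, cpuMs, budgetMs) {
  if (frameMs <= budgetMs) return "none";    // hitting the target
  if (cpuMs >= budgetMs * 0.8) return "cpu"; // script eats the budget
  return "gpu"; // frame is late but the CPU was mostly idle
}
```

In practice `frameMs` would come from successive `requestAnimationFrame` timestamps and `cpuMs` from `performance.now()` measured around the game logic.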

  • Ashley - I'm just curious; how are you finding the GPUs that are blacklisted? When I search for the Chrome blacklist on Google, the only thing I can find is this:

    https://www.khronos.org/webgl/wiki/BlacklistsAndWhitelists

    Assuming this is correct, it seems to apply only if you have drivers that haven't been updated in 8 years (or a couple of cards I've never heard of).

  • Just go to chrome://gpu and see what it says. WebGL 1 or 2 should say "hardware accelerated" in green. Construct will pick whichever one is hardware accelerated.
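Besides eyeballing chrome://gpu, the renderer can also be checked from script via the `WEBGL_debug_renderer_info` extension, which does exist in browsers. The string matching below is a heuristic of my own, not an official API:

```javascript
// Heuristic: renderer strings that indicate a software (blacklisted)
// WebGL implementation rather than hardware acceleration.
function isSoftwareRenderer(rendererString) {
  return /swiftshader|software|llvmpipe/i.test(rendererString);
}

// In a browser:
//   const gl = document.createElement("canvas").getContext("webgl");
//   const ext = gl && gl.getExtension("WEBGL_debug_renderer_info");
//   const renderer = ext
//     ? gl.getParameter(ext.UNMASKED_RENDERER_WEBGL)
//     : "unknown";
//   console.log(renderer, isSoftwareRenderer(renderer));
```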

  • Ashley - Yes, mine is hardware accelerated. I think that's kind of our point: for the game I'm working on, my CPU never goes above 50%, my RAM use is always less than 40 MB, and I'm not using any effects, yet somehow I'm unable to get a constant 60 fps (sometimes I sit in the 30s). If it's not HTML5, the CPU, or the GPU, what else could be causing something like that?

  • Is that in full screen?

  • If the CPU isn't maxed out, the most likely thing is that the GPU is maxed out, so the hardware is the bottleneck. The most common reason I've seen in practice is that the game is fillrate-bound (maxing out memory bandwidth), which is exactly the kind of thing people blame on HTML5/browsers/us without realising it would be identical in any other engine. I'd take a look at your .capx if you're willing to share it.
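To see why fillrate bites, a back-of-the-envelope count of pixels written per second helps; if this number approaches what the card can actually fill, resolution or overdraw has to come down. This is a hypothetical estimate of my own, not a Construct API:

```javascript
// Back-of-envelope fillrate estimate (hypothetical numbers):
// pixels written per second = width * height * overdraw * fps,
// where overdraw counts how many times each screen pixel is
// touched per frame (layers, large transparent sprites, effects).
function pixelsPerSecond(width, height, overdraw, fps) {
  return width * height * overdraw * fps;
}

// Example: a 1920x1080 game with 4x overdraw at 60 fps writes
// pixelsPerSecond(1920, 1080, 4, 60) pixels per second - roughly
// 498 million, enough to strain a low-end integrated GPU.
```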

  • ome6a1717 How many CPU cores do you have?

  • Please share the .capx with Ashley - I get second-hand frustration in these threads because users complain and then never share the final cause of their issues; I would love to see some closure.

  • I am a full-time Construct 2 freelancer and I've completed many projects for browser, desktop and mobile. Apart from making games from scratch, I've also had many requests from customers to finish or optimize their games.

    After several years with C2 I can clearly say that the most common source of performance issues in C2 is not the GPU but the C2 developers (or I should rather say their code). Of course the GPU, CPU, your machine and C2 itself also play some part in performance issues, but the majority of issues I've seen came from the C2 code (I mean the events in C2 devs' projects).

    C2 is pretty much like JavaScript or PHP or any other language that is not very strict from an architecture perspective. What I mean is that it is very easy to make simple things with C2 (like in JS, for instance), but it is a real challenge to make big projects the right way.

    The problem is (as mentioned above) the architectural freedom. While making a big game, people tend to use the same coding approach as for small games, and in the end they are lost in messy code which is totally unoptimized, since you don't really need to care about optimization in small projects.

    With big projects it's different. There are many more events, so many more operations for the CPU, and therefore you must use the best coding practices so each feature uses as little CPU as possible. Next to that, you need to keep your code architecture very organized so you don't get lost in your big project (and that is also not very easy with C2).

    It also depends on what you call a big project. I've worked with projects made of several thousand events, and after optimization they worked well on the majority of modern desktop devices. Still, the bigger the project, the bigger the challenge.

    So to wrap up: in my opinion it is possible to make a big project (not sure about a huge one) in C2 in a way that works very well, but it's not an easy task. Definitely not a task for a C2 newbie. It needs experience and a deep understanding of how memory management works in C2, which conditions are real triggers and which are fake triggers, what takes the most CPU, how to work around highly CPU-intensive parts, etc. Without all this knowledge you will probably fail at making a big game in C2, as C2 is simply difficult for such projects.

    On the other hand, if you pick Unity or any other engine, you also need to learn how to use it properly.

    EDIT: Just a clarification, as I received some PMs. Please note that I'm not saying performance issues are never related to C2 itself; I'm just saying that the most common issues come from bad coding.
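The "real vs fake triggers" point can be illustrated outside Construct: a polled condition (a fake trigger) pays its cost every tick, while a true trigger only does work when the event actually fires. A plain-JavaScript sketch, with made-up names, of how the costs diverge:

```javascript
// Illustration (plain JS, not Construct's event system): count how
// much work a polled condition does versus a real trigger over a run.
//   ticks        - number of game ticks to simulate
//   healthAtTick - function returning the player's health at tick t
function runTicks(ticks, healthAtTick) {
  let polledChecks = 0; // evaluations by the "fake trigger" (polled)
  let triggerRuns = 0;  // runs of the real trigger (fires once)
  let lastHealth = Infinity;
  for (let t = 0; t < ticks; t++) {
    const health = healthAtTick(t);
    polledChecks++; // the polled condition is evaluated every tick
    if (health <= 0 && lastHealth > 0) triggerRuns++; // edge-triggered
    lastHealth = health;
  }
  return { polledChecks, triggerRuns };
}
```

Over 100 ticks where health hits zero once, the polled condition is evaluated 100 times while the trigger runs once; in a project with thousands of events, that difference is where the CPU time goes.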

  • I agree 100% with all of this...

  • newt - not borderless fullscreen, but maximized.

    Ashley & Elliott - I usually don't like to do this, but I'll happily send you my capx. (Is there a way I can send it without posting it publicly? We are releasing on Steam soon and I'd prefer others not to have my capx file.)

    Also, if it is fillrate-bound, how is that combated? I assume that's what it is, since if I shrink to the standard resolution (640 x 360) I get a constant 60 fps, but that's a bit unrealistic to play at, and even using low settings doesn't change anything.

    Jase00 - only 2 cores (i7 Surface Book), but I believe HTML5 can only use 1 no matter what.

    - Honestly I would LOVE for it to be my bad programming, but if I disable all of my code the fps doesn't really change much. Unless, however, disabling is different from actually deleting it.

  • Not only "bad coding" but also poor budgeting of graphics, effects, particles, objects, collisions, force-own-texture, or anything that has a behavior, etc.

  • jobel - true, but I've tested all of that (deleted most of my objects, turned off all effects, and disabled code and collision checks) to no avail.
