Ashley's Forum Posts

  • You don't need to use AJAX. You can put a URL directly in the Sprite's 'Load image' action.

    The URL you shared is actually to an HTML page that contains an image, so it cannot be loaded as an image itself. The direct link to the image is this: https://raw.githubusercontent.com/cybersurfer5000/image_test/a94534768231f675818f0ab67244a9ac7ade3e56/image1.png

    (right-click the image and open it in a new tab, and that URL appears in the address bar)
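    The page-vs-image distinction comes down to the Content-Type the server responds with: the GitHub page URL serves text/html, while the raw.githubusercontent.com URL serves image/png. A minimal sketch (the helper name is mine, not a Construct API) that classifies a Content-Type header value:

```javascript
// Hypothetical helper: decide whether a Content-Type header value
// describes an image that could be loaded directly into a Sprite.
function isImageContentType(contentType) {
  // "text/html; charset=utf-8" -> false, "image/png" -> true
  return /^image\//i.test((contentType || "").trim());
}

console.log(isImageContentType("text/html; charset=utf-8")); // false
console.log(isImageContentType("image/png"));                // true
```

In a browser you could feed this the result of `fetch(url, { method: "HEAD" })` and read `response.headers.get("Content-Type")` to verify a URL before handing it to 'Load image'.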

  • There will be a beta release today.

  • Ashley Could you comment? 125000 to 800 seems too good to be true.

    Small collision cells will indeed further reduce the number of collision checks. But everything is a trade-off: small collision cells can use a lot more memory for large layouts, add a lot of overhead when there are many fast-moving objects, and probably add more overhead again when very large objects span lots of collision cells. As ever with performance, make measurements on real-world projects; don't just assume that a certain collision cell size is always the correct one.

  • I believe the advice still stands: collision cells can't be used once any other picking has happened, so it may be best for any collision conditions to come first and then any other conditions to further filter from there. However as ever the best way to get the right answer for your project is to measure it.

  • The next beta release updates the Mobile IAP plugin to use cordova-plugin-purchase@13.11.0, which uses Google Play Billing Library v7.0.0.

  • The equivalent SDK v1 calls are available in the SDK v2, but named _setTicking() etc. See here.

    Plugin and behavior ticks are sequenced differently to the IRuntime "tick" event: they run before it, and other kinds of tick also fire at different times.

    I suppose we could expose more types of ticks to the scripting feature; as ever, file a suggestion.

  • To be clear, offline support and installing the web app are actually completely separate things that work independently. It's possible to have a web export that is not installed, but works offline; it's also possible to have an installed web app that does not work offline.

    You can learn more about how projects work offline in the guide Offline games in Construct.

  • The feature to access known directories like Documents without a picker dialog will only work in WebView2. It won't apply to NW.js, and it won't work in Construct's normal preview either, as that runs in a normal browser rather than WebView2; you can however use the 'Export for Remote Preview' option.

  • That hasn't shipped in Construct yet. I was just testing an experimental WebView2 feature and reporting back my results to Microsoft.

    I thought you were referring to permission prompts when you said "skip the user verification" - it is already the case that NW.js and WebView2 should both skip permission prompts when using the File System object. However it can currently only access files and folders previously chosen by a picker dialog. The experimental feature I was testing was the ability to access known folders (e.g. Documents) without needing a picker dialog. Support for that should come in future.

  • But I saw in your opened ticket, that the File System api was worked around to skip the user verification, which is still in the beta branch.

    I don't recall what you're referring to here. Can you link to your reference?

    IIRC we already shipped the change to skip user verification some time ago, so it should already be in the current stable release.

    That might have been the case a few years ago, but the Steam Deck is running Linux and it grows in popularity every day.

    Yeah, you're right about the Steam Deck. At the moment we have to get by with NW.js on Linux.

    I'm afraid graphics driver bugs are a nightmare. Often there is no good solution other than to try to get the vendor to pay attention and fix the bug, which can be arduous.

    --in-process-gpu is a custom Chromium engine flag that may not be supported by Google. You could try reporting it to them, but they may just say that's not a supported mode.

    The reason Construct uses that flag is to work around another bug: the Steam overlay doesn't work without it. Developing for the overlay is a nightmare, as the Steam overlay seems hackily implemented and has various bugs that make it harder to integrate. We keep trying to ask Valve to improve or fix it but they generally just ignore us; we ask customers to ask Valve to improve it to increase the pressure, but generally if people can get by with workarounds then they don't bother. So we end up using --in-process-gpu long-term as a workaround, but workarounds don't fix the underlying problem, and sooner or later you run into some other problem and get cornered, as there is no way to have everything working.

    So I'm afraid we've done the best we can with Construct given the hacky/broken Steam Overlay and graphics driver bugs; for further help you'll have to contact other companies. Ideally Valve would make an overlay that can be more easily integrated!

  • 1. I noticed that the FileWebView2 plugin's folders expressions do not work in preview (I didn't test any of the other functions). Is this a bug or something that will not be possible with this WebView2 file system plugin?

    Construct's normal preview runs in a browser, not WebView2, so WebView2 features are not available there. You can however check 'Export for Remote Preview' when exporting WebView2 to more conveniently test WebView2 features via Remote Preview.

    2. Are there any plans on adding Linux support and macOS+Linux support for the steam plugin?

    Microsoft have not released WebView2 support for macOS or Linux yet, although they say they plan to. When they release support, we plan to also support it in Construct, and hopefully include support for Steam too.

    If not, what is the suggested work method for supporting all operating systems in the same project?

    If you still need macOS and Linux support, for simplicity I would advise to keep using NW.js for the time being. If you only need Windows support (and note that about 97% of Steam users use Windows) then I would recommend using WebView2 instead.

    Can a project work with both the NW.js and WebView2 Steam plugins, or will they conflict with one another?

    There should not be any conflict.

    Will it be possible to use the WebView2 file system to save and read files in NWjs or will I need to have two sets of saving/loading events?

    You will need two sets of saving/loading events, unless you can use the File System object, which works consistently across both.

    3. Will it ever be possible to set a WebView2 app window to maximize in runtime?

    It's already possible via a custom wrapper extension, but support is not yet built-in. You can however request fullscreen on startup with WebView2 just like you can with NW.js.

  • winkr7 - please do not use AI to generate answers. It is not allowed by the Forum & Community guidelines, because AI answers are generally garbage, and the one you provided is garbage too.

    The correct answer is that when installed, the display mode will be "standalone", so you should check for that.
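    That check can be sketched as follows; a minimal example assuming the standard `display-mode` media query, with the helper name being mine so it can take `window.matchMedia` directly (or a stub when testing outside a browser):

```javascript
// Hypothetical helper: true when the web app is running as an installed app.
// Pass window.matchMedia in the browser; any compatible function works in tests.
function isInstalledDisplayMode(matchMedia) {
  return matchMedia("(display-mode: standalone)").matches;
}

// In a real page: isInstalledDisplayMode(window.matchMedia)
```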

    I have yet to see these disasters you are talking about.

    We see them - desperate customers with ruined projects come and beg us for help. Sometimes the addon developer responsible has left the community, and there's nobody left to help them.

    Many addons continue working for years without any issues.

    As I said before, these could all permanently break with any release. The main reason it hasn't happened already is sheer luck. I expect our luck will not last forever, especially as other addons can and do break on a fairly regular basis.

    nobody blames you when this happens

    They sure do. If someone's project works in r300 and breaks in r301, they contact us, tell us we broke their project, and insist that we revert whatever we did in r301 so their project keeps working. Trying to explain these issues of encapsulation to people is often fruitless, as most customers don't care, and to them it often sounds like we're just pointing the finger at someone else and shirking responsibility. It's a real nightmare for us to deal with and is continually happening in the background, unbeknown to most addon developers - often including the addon developers responsible for the problem.

    To the others who have responded: I do appreciate your feedback; I am keenly aware of the value addon developers bring to the community; and I greatly regret that we are adding more work for them to do. However an API with encapsulation is the industry standard. No other serious tool allows addons such unfettered access to engine internals, precisely because it is an unprofessional and reckless way to develop software that most people in the industry understand will end in disaster. Part of the Addon SDK v2 project is to increase the existing API surface to make sure it's more capable, and to do so in a way that is reliable in the long term. Where possible, we are also making some addons' features built-in, which removes the need for the addon to be ported. For example, the next release includes a 'Reset' action for event variables, so it is no longer necessary to use an addon for that. We will continue this to make the best effort possible to smooth the transition and make sure any key features are implemented in Construct itself.

    I know we're asking a lot here. But if we don't act, in extremis this could turn into an existential problem for Construct, ending in development hell where every update breaks loads of projects and we are in effect no longer able to improve Construct any further; ultimately that can lead to the failure of the product on the market. This has happened to other products before and, as things stand, could easily happen to us in future. I am doing my best to explain that this is a real, active risk, there are signs of it happening already, it's likely to get worse, and it warrants serious action to rectify. I am asking addon developers and customers alike to make a good-faith effort to take seriously the risks we are facing, and to co-operate as we go through what will unfortunately be a disruptive but necessary process. We have to think on a 5-10 year time scale, and we need to go through this to be in a better place in future.

    Why are you trying to convince or gaslight everyone into believing that the developers are too lazy to update the plugins?

    I'm afraid I genuinely have no idea where you got the impression I thought addon developers are lazy. I have been talking about how I am keenly aware of just how much work this is going to create and how disruptive it is going to be. We are asking addon developers to co-operate with us through the transition and I am writing a lot on the forums here to explain the reasons for this, as it is fair that we should have to justify why we are doing something so disruptive.

    WackyToaster - regarding the "group handler" addon: I'm afraid that is the very reason we had the warning in the SDK. It could break at any time. If it's worked for several years already, that is absolutely no guarantee at all that it will continue working even one more week - unless we have a proper API with encapsulation. We could rename some variables or fix a bug in the engine, and your addon would be broken. It could break repeatedly, every few weeks, for a year or more, and then ultimately break permanently, causing untold disruption along the way. This is already happening with other addons and will continue happening. Given that in the long run pretty much all parts of the engine are eventually changed or upgraded, all such addons accessing internal details are likely to eventually break. Part of the point of this project is that, rather than sitting around waiting for disaster to eventually strike all these addons at random and unpredictable times, we're managing the process up-front with a published timetable for what action will be taken, so everyone can plan ahead and the problem can be solved once and for all.

    By ignoring the warning in the SDK, I am afraid you must carry some of the responsibility. We did say undocumented features could be removed at any time. So we reserve the right to remove all those features, break your addon permanently, and say "tough luck". Then all the customers' projects using the addon are broken, and it was because you ignored the warning in the SDK. However, we are not doing that: instead we are laying out a timetable and doing what we can to make the transition smoother. If people really need your addon and removing it is infeasible, they'll have to stay on old releases (possibly a future LTS release) until they finish their projects; if it really is going to cause enormous disruption, we might add the necessary features to the Addon SDK v2 to allow you to port your addon, depending on a judgement of the situation, the feasibility of maintaining the APIs, and so on. However, at the end of the day, we reserved the right to permanently remove all those features, and we did warn you of that.

    The industry came up with encapsulation decades ago precisely to solve this problem. Experienced developers know that if outside developers get access to internal details, of course it ends in disaster, of course it makes a huge mess for everyone. All other tools use encapsulation for an addon system to avoid this. As I've explained, the addon SDK lacked encapsulation mainly for historical reasons as JavaScript had poor support for encapsulation in the past. Either you learn the lessons of history and respect encapsulation, or you disregard them, and you learn the hard way as it invariably ends in disaster.
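    As an aside on JavaScript's historically poor encapsulation support: private class fields (the `#` syntax, now standard in the language) are the mechanism that makes an enforced public API possible. A minimal sketch with illustrative names - not Construct's real internals:

```javascript
// Encapsulation with a private class field: outside code physically cannot
// read or depend on #cellSize, so renaming it cannot break third parties.
class Engine {
  #cellSize; // internal detail, invisible outside the class

  constructor(cellSize) {
    this.#cellSize = cellSize;
  }

  // The documented surface that stays stable across releases
  getCellSize() {
    return this.#cellSize;
  }
}

const engine = new Engine(256);
console.log(engine.getCellSize()); // 256
// Writing engine.#cellSize outside the class is a SyntaxError, and the
// field does not appear as a property, so it cannot leak into addons.
```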

    Given all this, I have to say, in my view it was unwise to develop such addons. Adding a few small handy features by accessing internal details, at the risk of causing disaster and potentially ruining everyone's projects in the long run, is just not worth it. The alternative may be some more clunky event sheets and perhaps slightly less smooth workflows, but people's projects keep working indefinitely. In the long run that's the better outcome. In my view, fully understanding the reasons encapsulation exists and the severity of the risks of bypassing it, I would not even consider trying to bypass encapsulation under pretty much any circumstances at all, short of a genuine emergency where not acting is also sure to cause a disaster. Anything else at all, even significant inconvenience, is better; the extent of the disaster it risks must be avoided at all costs. This is why, despite all the disruption, we believe it is absolutely essential that we move to a new addon SDK with much stronger encapsulation. Either we have planned disruption now, or we have never-ending continual unexpected disasters of increasing severity for years to come.

    I am also aware many addons are well-behaved and have never accessed any internal details. I do regret that these addons still need updating to the new SDK. I hope that by explaining the situation in this much detail, I am highlighting how important this work is and why it's for the best in the long term.