Server-Sent events, why didn't I know about these?

  • Ok, I've just had an epiphany. I've been contemplating a chat system for my latest project. The options I had up until a few moments ago were:

    Long polling - whereby the game client sends an AJAX request every second or so to see if there are any new chats it needs to grab. While this system definitely works, I can see it turning a web server into a puddle of goo when you have 1000 clients polling for chats every second along with trying to serve up web pages and responding to other AJAX requests.

    Websockets - whereby I write some websocket server in node.js or C# and have it run as an application on a server somewhere with the game client logged into it. While this would be duplex, instantaneous and efficient, trying to configure a wss server and write the code for it sounds like a nightmare. Not to mention the fact that it would pretty much require a dedicated server somewhere so that it could run continuously.

    And then my epiphany:

    Server-Sent events - I just read about these here:

    https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events

    and decided to test it out and ummmm... wow.

    So, I dropped an HTMLElement plugin on my layout and put this in the text field:

    <ul id="screen"></ul>
    

    I then took my JS file with all my bits of code and added this (with the URL pointed at the PHP script on my web server):

    var evtSource = new EventSource("//api.example.com/ssedemo.php", { withCredentials: true });

    // Unnamed "data:" lines from the server arrive here
    evtSource.onmessage = function(e) {
        var newElement = document.createElement("li");
        var eventList = document.getElementById("screen");

        newElement.innerHTML = "message: " + e.data;
        eventList.appendChild(newElement);
    };

    // Lines sent with "event: ping" arrive here instead
    evtSource.addEventListener("ping", function(e) {
        var newElement = document.createElement("li");
        var eventList = document.getElementById("screen");

        var obj = JSON.parse(e.data);
        newElement.innerHTML = "ping at " + obj.time;
        eventList.appendChild(newElement);
    }, false);
    

    I took the PHP code from the URL above, added a CORS header (which is really nice because I can pretty much guarantee that requests will be coming from the game), and ran my project.
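
    For reference, the stream that script sends is nothing exotic: it's plain text written down one open HTTP response with a text/event-stream content type. Unnamed data: lines fire onmessage above, and lines tagged event: ping go to the named listener. Roughly:

    event: ping
    data: {"time": "2019-05-29T13:46:10-0400"}

    data: This is a message at time 2019-05-29T13:46:16-0400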

    To my amazement, I started receiving messages from the php script.

    message: This is a message at time 2019-05-29T13:46:09-0400

    ping at 2019-05-29T13:46:10-0400

    ping at 2019-05-29T13:46:11-0400

    ping at 2019-05-29T13:46:12-0400

    ping at 2019-05-29T13:46:13-0400

    ping at 2019-05-29T13:46:14-0400

    ping at 2019-05-29T13:46:15-0400

    ping at 2019-05-29T13:46:16-0400

    message: This is a message at time 2019-05-29T13:46:16-0400

    ping at 2019-05-29T13:46:17-0400

    ping at 2019-05-29T13:46:18-0400

    ping at 2019-05-29T13:46:19-0400

    ping at 2019-05-29T13:46:20-0400

    Now, normally web hosts force PHP scripts to crash and burn after a set time. This one, 30 minutes later, is still sending pings to my C3 game client. So either my host has lost their mind, or these long-lived connections somehow dodge that limit.

    The only downsides I can find to this so far are:

    • It doesn't work in Edge and IE browsers (which I can certainly live with since IE is dead and Edge is switching to Chromium).
    • It's not duplex; it's one-way communication from the server to the client. BUT - AJAX requests to send chats in from the client, and server-sent events to feed the chats back out, should work (a quick client-side sketch follows this list).
    • When running the PHP script above on Apache, it opens a new process for every single client that connects. Get 1000 clients and you've turned your server into goo again. A huge drawback.
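
    Here's the kind of thing I mean for the sending half, just a sketch (sendchat.php is a hypothetical endpoint and the payload shape is made up; the EventSource code above handles the receiving side):

    // Send a chat to the server; a hypothetical sendchat.php would store it
    // and the SSE script would push it back out to everyone listening.
    function sendChat(name, text) {
        return fetch("//api.example.com/sendchat.php", {
            method: "POST",
            credentials: "include",
            headers: { "Content-Type": "application/json" },
            body: JSON.stringify({ name: name, text: text })
        });
    }

    // Usage: sendChat("Player1", "hello from C3");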

    Now, according to this post:

    https://stackoverflow.com/questions/14225501/server-sent-events-costs-at-server-side

    Node.js should be able to handle server-sent events with just one process and have thousands of clients connected.
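
    I haven't tried it yet, but from what I can tell a single-process Node version of the same stream would look roughly like this (untested sketch using only the built-in http module; the port and allowed origin are placeholders):

    const http = require("http");

    // One long-lived response object per connected client, all in one process
    const clients = new Set();

    http.createServer((req, res) => {
        res.writeHead(200, {
            "Content-Type": "text/event-stream",
            "Cache-Control": "no-cache",
            "Connection": "keep-alive",
            // Assumption: the game's origin, since the client uses withCredentials
            "Access-Control-Allow-Origin": "https://mygame.example.com",
            "Access-Control-Allow-Credentials": "true"
        });
        clients.add(res);
        req.on("close", () => clients.delete(res));
    }).listen(8080);

    // Broadcast a ping to every open connection once a second
    setInterval(() => {
        const payload = JSON.stringify({ time: new Date().toISOString() });
        for (const res of clients) {
            res.write("event: ping\ndata: " + payload + "\n\n");
        }
    }, 1000);

    The difference from the Apache setup is that one process just holds every response object in memory and writes to all of them on a timer, instead of parking a whole PHP process per client.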

    Anyone else played with server-sent events? Any thoughts?

  • I haven't used server-sent events before, but they look potentially useful for some situations. I had a quick look at how to implement them at a node.js level, and I don't think you're going to see any less system meltdown than with long poll or websockets; they are very similar.

    Long poll: Client makes network request to server, server holds onto the request until data comes in or the request times out (normally ~30s), then responds.

    Short poll: Client makes network request to server, server responds immediately with "no change" or data.

    Server sent events: Client makes network request to server, server holds onto connection and writes data to connection when it comes in.

    Websocket: Client makes network request to server, server holds onto connection and writes data to the connection when it comes in; the client can also send data to the server.

    I've used both short poll and websockets on commercial products and can say they work very well, with reasonably minimal complexity.

    The only place I've seen long poll before is the Dropbox REST API, where you can subscribe to notifications on files changing. It's quite a pain to implement on both the client and server side, as you need to prevent "stampede" situations: responding closes the network connection, so if you send a message to each client they will all close and then immediately reopen a connection, causing a large bump in system resource usage.

    If I were to implement an instant chat service I would use either short poll or websockets. Websockets have obvious advantages: the server knows when a client connects/leaves, clients can send messages to the server without needing another request, and users receive messages "instantly". The biggest issue with websockets is normally the server running out of sockets for connections. Short poll allows for more users than sockets (in theory) as they don't all connect at the same time. Also, you can avoid needing an extra request for sending messages from the client by adding pending messages to the next poll request. A poll interval of between 1 and 5 seconds should be fast enough for most situations.
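
    The piggy-backing is simple enough on the client side; something along these lines (the /chat/poll endpoint and the payload shape are just placeholders):

    // Outgoing chats queue up locally and get flushed with the next poll,
    // so there's no separate "send" request.
    const pending = [];

    async function poll() {
        const res = await fetch("/chat/poll", {                 // hypothetical endpoint
            method: "POST",
            headers: { "Content-Type": "application/json" },
            body: JSON.stringify({ send: pending.splice(0) })   // drain the queue
        });
        const { messages } = await res.json();                  // anything new since last poll
        messages.forEach(m => console.log(m.name + ": " + m.text));
    }

    setInterval(poll, 3000);  // somewhere in the 1-5 second range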

  • Personally I'd go with a WebSocket - the node ws module is pretty straightforward (assuming you're already familiar with HTTP), and setting up a secure WebSocket is no harder than setting up a secure website - a WebSocket connection starts with an HTTP request, after all.
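
    To give an idea of how little code is involved, a bare-bones chat relay with the ws module is only a handful of lines (untested sketch; the port is arbitrary):

    const WebSocket = require("ws");

    const wss = new WebSocket.Server({ port: 8080 });

    wss.on("connection", ws => {
        ws.on("message", data => {
            // Relay each incoming chat message to every connected client
            for (const client of wss.clients) {
                if (client.readyState === WebSocket.OPEN) {
                    client.send(data.toString());
                }
            }
        });
    });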

  • Thanks Nepeo for the detailed reply. Here's what I'm seeing: every server-sent events connection using PHP (which blocks, holding the request open) creates a new process on the server and holds onto it. If you get 1000 people playing, that's 1000 processes just for the outgoing chat, never mind the AJAX calls for incoming messages.

    Long and short polling using PHP have the same problem: each call to the server creates a new process.

    Now, I've had a short poll PHP chat up and running on my server and working perfectly inside the HTMLElement plugin. You said you had short poll working on a commercial product, but how many polls per second can Apache really deal with? I understand Nginx works a bit better in this department, but that's a whole new ball of wax. Either way, it still seems like a lot of processes firing up and shutting down on the server.

    Ashley I'm leaning toward your thinking but... I had to watch a 'noob' video on node.js just to figure out how it worked. Yeah, I'm that JS-ignorant. So, for me, it would be easier to create a C# ws server using a library I found called Fleck and try to compile it under MonoDevelop for Linux. Problem there: that's going to require a virtual server somewhere that I'm going to have to manage. I tried to install a basic node.js web server (the example one) on my current host and got slapped. My host politely informed me that I'd need a virtual server for that too.

    Right now, the only thing I've found that may be a viable solution that's not going to require me managing a server (I haven't played with it yet) is a PHP library called React (ReactPHP, not the JavaScript framework).

    Of course, money solves many of those issues I suppose.

  • Well, what I was getting at is that with short poll you have 1000 users doing brief (maybe 50ms) requests every 5 seconds, but at different time offsets. So you don't actually have 1000 concurrent requests at any one time. In the worst-case scenario you could have 1000 concurrent requests for 1000 users, but all the users would have to start at exactly the same time and have the same network latency.

    The project I worked on had a cache in front of it: as the data wasn't specific to a user, a proxy server (or several) served the same result to multiple requests for 1 second. So the proxy only had to make up to 1 request to the main server per second for polling, no matter how many users were connected. If you wanted to split users up into groups/chatrooms, you could have a different endpoint per group; then you have up to 1 request per endpoint per second instead. I can't really remember how many users we got, but for read-only data that technique scales linearly with how many caching servers you have. Sending a message has to go directly to the server, but we can assume the message density will be a lot lower than the poll requests. So your requests would be (groupCount + sentMessagesPerSecond) for your caching period.

    Much of the limit you're looking at comes down to how PHP works. I would presume there's a way to pool processes to reduce start-up time and limit how many new processes the server has to spin up. I've never worked on a commercial PHP project, so I don't know much about optimisation for it unfortunately.

    As Ashley says, websockets are likely to be the easiest solution; I would expect a node.js instance with the "ws" library to scale into the tens of thousands of connections in a single process without much difficulty. But as you say, that requires you to learn JS and pay for a VPS... I vaguely remember a few hosting providers that will give you a really slow VPS on a free intro tier; might be worth a look? Just keep an eye on your server usage, these things can rack up without you realising.

  • Thanks for the input guys. Obviously I have a bit of homework to do.
