Socket gets disconnected on the server, but not on client. #2770

Closed
Description

@hifall

I am using the browser JavaScript client to connect to a remote Socket.IO server, and I see this issue from time to time, especially on a poor network: for a socket initiated from the browser, the server fires the 'disconnect' event, but the browser never receives its own 'disconnect' event (in quite a few cases I waited several minutes for the event to arrive in the browser, and it never did).

This is bad for our situation. On the server, we allocate resources for the socket when we see the 'connection' event and free them when we see 'disconnect'. On the browser side, we keep using the socket after it connects until it receives 'disconnect', at which point it tries to reconnect and put the server back into a proper state. If 'disconnect' fires on the server but not on the client, the server-side state has already been torn down while the client still blissfully believes the server is fine and asks it to do work that needs resources that no longer exist. A minimal sketch of this pattern is below.
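Roughly what the two sides look like (a simplified sketch, not our actual code; the "resource" object, event name, and URL are illustrative placeholders):

```js
// Server: allocate per-socket state on 'connection', free it on 'disconnect'.
const io = require('socket.io')(3000);
const resources = new Map();

io.on('connection', (socket) => {
  resources.set(socket.id, { createdAt: Date.now() }); // stand-in for real setup
  socket.on('disconnect', (reason) => {
    resources.delete(socket.id);                       // stand-in for real teardown
    console.log('server saw disconnect:', socket.id, reason);
  });
});
```

```js
// Browser: keep using the socket until we see 'disconnect' ourselves.
const socket = io('https://example.com');        // placeholder URL
socket.on('connect', () => {
  // From here on the client assumes the server-side state still exists.
  socket.emit('do-work', { some: 'payload' });   // illustrative event
});
socket.on('disconnect', (reason) => {
  // Only now does the client stop, reconnect, and re-initialize the server state.
  console.log('client saw disconnect:', reason);
});
```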

Since Socket.IO uses a heartbeat mechanism to detect whether a socket is disconnected, some small gap between when the client-side and server-side 'disconnect' events fire is understandable, say ~10 seconds. But it should not stretch to minutes, let alone longer (or forever).
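My understanding is that this gap should be bounded by the heartbeat settings on the server, roughly pingInterval + pingTimeout (the values below are illustrative, not necessarily the defaults of the release we run):

```js
// Heartbeat settings on the Socket.IO server (illustrative values).
// A dead connection should normally be detected within about pingInterval + pingTimeout.
const io = require('socket.io')(3000, {
  pingInterval: 25000, // how often a heartbeat ping is sent
  pingTimeout: 10000   // how long to wait for the reply before declaring a disconnect
});
```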

Any idea?
