In a multiplayer game environment, the smoothness of a game depends on factors such as the game's netcode, the player's hardware, the network connection, and the server's response time. Players with poor network conditions and spiking network synchronization are a persistent problem for multiplayer sessions on both PCs and gaming consoles. Previous approaches to this problem disconnect such players based on their connection statistics. However, the commonly used scheme relies on constant boundary values and is mostly biased toward disconnection. Disconnecting players with medium to poor network status is rarely an optimal practice, since it renders the game unplayable for a person who has spent hard-earned money on it. Today, fast CPUs and graphics processors can render the game and receive the game state at high frame rates; hence, the perceived quality of the game depends primarily on the network connections of the participants. Games typically measure the latency between the server and the player by sending simple ping packets, similar to the well-known Unix tool, and an administrator evaluates this value to decide whether the player should be disconnected. In this paper, we discuss evaluating the player's more complex network statistics and automatically deciding on the player's removal using pattern recognition techniques.
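To make the fixed-threshold baseline concrete, the following minimal Python sketch illustrates the kind of constant-boundary disconnection rule described above; the threshold value, sample data, and function name are hypothetical and not taken from any particular game.

```python
import statistics

# Hypothetical constant boundary (ms); real games tune such limits per title.
LATENCY_LIMIT_MS = 250.0

def should_disconnect(rtt_samples_ms):
    """Fixed-threshold rule: disconnect if the mean round-trip time
    of recent ping measurements exceeds a constant limit."""
    if not rtt_samples_ms:
        return False
    return statistics.mean(rtt_samples_ms) > LATENCY_LIMIT_MS

# Example: a player whose latency spikes intermittently.
recent_pings_ms = [80.0, 95.0, 310.0, 540.0, 120.0]
print(should_disconnect(recent_pings_ms))  # True under this constant boundary
```

A rule of this form ignores the shape of the latency history (brief spikes versus sustained degradation), which is exactly the limitation that motivates evaluating richer network statistics with pattern recognition techniques.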