One of the most important problems of data transmission in packet networks, in particular in wireless sensor networks, is the periodic overflow of buffers accumulating packets directed to a given node. During a buffer overflow, all newly arriving packets are lost until the overflow condition ends. From the point of view of network optimization, it is very important to know the probabilistic nature of this phenomenon, including the probability distribution of the duration of the buffer overflow period. In this article, a discrete-time mathematical model of a wireless sensor network node is proposed. The model is governed by a finite-buffer discrete-time queueing system with geometrically distributed interarrival times and a general distribution of processing times. A system of equations for the tail cumulative distribution function of the duration of the first buffer overflow period, conditioned on the initial state of the accumulating buffer, is derived. The solution of the corresponding system, written for probability generating functions, is found using an analytical approach based on the embedded Markov chain technique and linear algebra. A corresponding result for subsequent buffer overflow periods is obtained as well. A numerical study illustrating the theoretical results is included.
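The quantity studied analytically in the article can also be estimated empirically. The following is a minimal Monte Carlo sketch (not the paper's method) of a discrete-time queue with geometric interarrival times, a general discrete service-time distribution, and a finite buffer; it records the lengths of maximal runs of slots during which the buffer is full, i.e. overflow period durations. The slot ordering (overflow check, then arrival, then one slot of service) and the function name are illustrative assumptions, not taken from the article.

```python
import random

def simulate_overflow_periods(p, service_pmf, capacity, horizon, seed=0):
    """Monte Carlo sketch of a discrete-time queue with geometric
    interarrival times and general service times (illustrative only).

    p           -- per-slot arrival probability (geometric interarrival times)
    service_pmf -- dict {service duration in slots: probability}
    capacity    -- buffer size N (packets in system, including one in service)
    horizon     -- number of slots to simulate

    Returns the list of observed overflow period durations: lengths of
    maximal runs of slots in which the buffer is full at the slot start,
    so that an arriving packet would be lost.
    """
    rng = random.Random(seed)
    durations, weights = zip(*service_pmf.items())
    queue = 0          # packets currently in the system
    remaining = 0      # remaining service slots of the head-of-line packet
    full_run = 0       # length of the current run of "buffer full" slots
    periods = []
    for _ in range(horizon):
        # Track overflow periods: count consecutive slots with a full buffer.
        if queue == capacity:
            full_run += 1
        elif full_run:
            periods.append(full_run)
            full_run = 0
        # Geometric arrivals: at most one packet per slot, with probability p.
        if rng.random() < p:
            if queue < capacity:
                queue += 1
            # else: the packet is lost during the overflow period
        # Service: advance the head-of-line packet by one slot of work.
        if queue > 0:
            if remaining == 0:
                remaining = rng.choices(durations, weights=weights)[0]
            remaining -= 1
            if remaining == 0:
                queue -= 1
    if full_run:
        periods.append(full_run)
    return periods
```

An empirical tail distribution built from the returned list (e.g. the fraction of periods longer than a given number of slots) can then be compared against the analytical tail cumulative distribution function derived in the article.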