With the growing share of encrypted traffic transmitted over the Internet, it has become impossible to directly identify the causes of anomalies in network behavior, since the contents of encrypted packets are inaccessible. This has significantly complicated the task of identifying information security threats. Only external symptoms are available for analysis, manifesting as changes in basic traffic parameters such as volume, intensity, and inter-packet delays. As a result, the role and importance of algorithms for detecting changes in traffic have increased. These algorithms, which employ modern methods such as machine learning, can identify various types of anomalies, including previously unknown ones. They analyze network traffic parameters that are available for direct measurement, representing their evolution as time series. One of the least studied classes of such algorithms is the direct comparison of histograms of time series value distributions over different time intervals, in particular a subclass known as metric algorithms. These algorithms are based on the assumption that differences between histograms of time series values on adjacent observation intervals indicate changes in the flow of events that generates the network traffic. However, the problem of measuring the difference or similarity between histograms, treated as objects in a multidimensional space, has no unambiguous solution. The paper analyzes existing histogram similarity metrics and describes a series of studies based on statistical modeling. These studies evaluated the dependence of algorithm efficiency on external parameters and compared algorithms of this class with other change detection algorithms, making it possible to assess their practical applicability. The results show that metric algorithms for comparing histograms can demonstrate high performance and, in some cases, outperform other known algorithms for detecting changes in time series. They reduce the number of false positives and shorten the delay between the moment a change appears in the observed object and the moment it is detected by the algorithm.
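To make the general idea concrete, the following is a minimal sketch of a metric-style histogram comparison, not the specific algorithms studied in the paper: a traffic parameter series is split into consecutive observation windows, each window is summarized by a normalized histogram, and the distance between histograms of adjacent windows serves as a change score. The function name, the Euclidean distance, and all parameter values are illustrative assumptions.

```python
import numpy as np

def histogram_change_scores(series, window=200, bins=20, value_range=None):
    """Score changes by comparing histograms of adjacent windows.

    Illustrative sketch: window size, bin count, and the choice of
    metric (Euclidean here) are assumptions that would be tuned or
    replaced by other histogram similarity metrics in practice.
    """
    if value_range is None:
        # Use common bin edges for all windows so histograms are comparable.
        value_range = (np.min(series), np.max(series))
    n_windows = len(series) // window
    hists = []
    for i in range(n_windows):
        chunk = series[i * window:(i + 1) * window]
        h, _ = np.histogram(chunk, bins=bins, range=value_range)
        hists.append(h / h.sum())  # normalize counts to a discrete distribution
    # Distance between adjacent histograms; large values suggest a change
    # in the process generating the traffic.
    return [float(np.linalg.norm(hists[i + 1] - hists[i]))
            for i in range(len(hists) - 1)]

# Synthetic usage example: a shift in the mean after sample 2000
# should produce a spike in the scores around the corresponding window pair.
rng = np.random.default_rng(0)
traffic = np.concatenate([rng.normal(10, 2, 2000), rng.normal(14, 2, 2000)])
print(histogram_change_scores(traffic))
```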