In recent years there has been growing interest in developing "streaming algorithms" for the efficient processing and querying of continuous data streams. These algorithms seek to provide accurate results while minimizing the required storage and processing time, at the cost of a small inaccuracy in their output. A fundamental query of interest is the intersection size of two big data streams. This problem arises in many application areas, such as network monitoring, database systems, data integration, and information retrieval. In this paper we develop a new algorithm for this problem, based on the Maximum Likelihood (ML) method. We show that this algorithm outperforms all known schemes and that it asymptotically achieves the optimal variance.

Our main contributions are as follows:

1. For the first time, we present a complete analysis of the statistical performance (bias and variance) of the three previously known schemes.

2. We derive the optimal (minimum) variance achievable by any unbiased set intersection estimator.

3. We present and analyze a new unbiased estimator, based on the Maximum Likelihood (ML) method, which outperforms the three known schemes.

The rest of the paper is organized as follows. Section 2 discusses previous work and presents the three previously known schemes. Section 3 presents our new ML estimator; it also shows that the new scheme achieves the optimal variance and outperforms the three known schemes. Section 4 analyzes the statistical performance (bias and variance) of the three known schemes. Section 5 presents simulation results confirming that the new ML estimator outperforms the three known schemes. Finally, Section 6 concludes the paper.
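To make the problem concrete, the following is a minimal sketch of one standard (non-ML) way to estimate the intersection size of two streams from small summaries: each stream keeps a MinHash signature, the fraction of matching signature slots estimates the Jaccard similarity J, and the identity |A ∩ B| = J/(1+J) · (|A| + |B|) converts that into an intersection-size estimate. This is an illustrative baseline only, not the paper's ML estimator; the function names and parameters are our own.

```python
import random

def minhash_signature(items, num_hashes=512, seed=0):
    """Build a MinHash signature of a stream: for each of num_hashes
    seeded hash functions, record the minimum hash value seen."""
    rng = random.Random(seed)
    salts = [rng.getrandbits(64) for _ in range(num_hashes)]
    # One pass per salt here for clarity; a real streaming version
    # would update all minima in a single pass over the items.
    return [min(hash((salt, x)) for x in items) for salt in salts]

def estimate_intersection(sig_a, sig_b, size_a, size_b):
    """Estimate |A ∩ B| from two MinHash signatures.

    The fraction of matching slots estimates the Jaccard index
    J = |A ∩ B| / |A ∪ B|, and since |A ∪ B| = |A| + |B| - |A ∩ B|,
    we get |A ∩ B| = J / (1 + J) * (|A| + |B|).
    """
    matches = sum(1 for a, b in zip(sig_a, sig_b) if a == b)
    j = matches / len(sig_a)
    return j / (1.0 + j) * (size_a + size_b)

# Two synthetic "streams" whose true intersection size is 5000.
A = set(range(0, 10000))
B = set(range(5000, 15000))
est = estimate_intersection(minhash_signature(A), minhash_signature(B),
                            len(A), len(B))
```

With 512 hash functions the estimate concentrates near the true value of 5000; the variance of such sketch-based estimators, and how far it sits from the optimum, is exactly what the analysis in this paper quantifies.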