Abstract. We present a statistical framework for estimating global navigation satellite system (GNSS) non-ionospheric differential time delay bias. The biases are estimated by examining differences of measured line-integrated electron densities (total electron content: TEC) that are scaled to equivalent vertical integrated densities. The spatiotemporal variability, instrumentation-dependent errors, and errors due to inaccurate ionospheric altitude profile assumptions are modeled as structure functions. These structure functions determine how the TEC differences are weighted in the linear least-squares minimization procedure, which is used to produce the bias estimates. A method for automatic detection and removal of outlier measurements that do not fit into a model of receiver bias is also described. The same statistical framework can be used for a single receiver station, but it also scales to a large global network of receivers. In addition to the Global Positioning System (GPS), the method is also applicable to other dual-frequency GNSS systems, such as GLONASS (Globalnaya Navigazionnaya Sputnikovaya Sistema). The use of the framework is demonstrated in practice through several examples. A specific implementation of the methods presented here is used to compute GPS receiver biases for measurements in the MIT Haystack Madrigal distributed database system. Results of the new algorithm are compared with the current MIT Haystack Observatory MAPGPS (MIT Automated Processing of GPS) bias determination algorithm. The new method is found to produce estimates of receiver bias that have reduced day-to-day variability and more consistent coincident vertical TEC values.
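To make the weighted least-squares idea summarized above concrete, the following minimal sketch (not the paper's implementation) assumes a simplified measurement model in which differences of vertical TEC taken at nearly coincident ionospheric pierce points cancel the true ionosphere and leave only receiver bias differences, each weighted by an assumed structure-function variance. All names, the variance model, and the choice of reference constraint are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Minimal sketch: estimate per-receiver biases from pairwise differences of
# vertical TEC, weighting each difference by an assumed structure-function
# variance before solving the linear least-squares problem.

def estimate_biases(pairs, vtec_diff, var, n_rx):
    """pairs: (M, 2) receiver index pairs; vtec_diff: (M,) vertical TEC
    differences (TECu); var: (M,) assumed structure-function variances;
    n_rx: number of receivers. Returns one bias estimate per receiver."""
    A = np.zeros((len(pairs), n_rx))
    for m, (i, j) in enumerate(pairs):
        A[m, i], A[m, j] = 1.0, -1.0      # each difference couples bias_i - bias_j
    w = 1.0 / np.sqrt(var)                # whiten rows by assumed variances
    Aw, yw = A * w[:, None], vtec_diff * w
    # Differences constrain biases only up to a common offset, so pin the sum
    # of biases to zero (one simple reference choice; the paper may differ).
    Aw = np.vstack([Aw, np.ones((1, n_rx))])
    yw = np.append(yw, 0.0)
    b, *_ = np.linalg.lstsq(Aw, yw, rcond=None)
    return b

# Example with three receivers and synthetic, hypothetical differences.
pairs = np.array([[0, 1], [1, 2], [0, 2]])
vtec_diff = np.array([1.9, -0.8, 1.2])    # TECu
var = np.array([0.5, 0.5, 1.0])           # larger variance -> lower weight
print(estimate_biases(pairs, vtec_diff, var, 3))
```

In this toy setup, down-weighting a TEC difference with a large structure-function variance reduces its influence on the bias solution, which is the role the structure functions play in the weighting described above.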