This paper addresses the linear least-squares estimation of a signal from measurements subject to stochastic sensor gain degradation and random transmission delays. These uncertainty phenomena, common in networked systems, have traditionally been described by independent Bernoulli random variables; we propose a more general model with greater applicability to real-life situations. The model has two distinctive features: first, the sensor gain degradation is represented by a white sequence of random variables taking values in [0,1]; second, the absence or presence of transmission delays is described by a homogeneous three-state Markov chain, which captures possible correlation between delays at different sampling times. Furthermore, assuming that the measurement noise is one-step correlated, we derive recursive prediction, filtering and fixed-point smoothing algorithms that require only the first- and second-order moments of the signal and of the processes involved in the observation model. Simulation results for a scalar signal illustrate the feasibility of the proposed algorithms, using the estimation error variances as a measure of estimator quality.
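To make the observation model concrete, the following is a minimal simulation sketch under stated assumptions; the paper fixes only the model structure, so every specific choice here is illustrative: the scalar signal is taken as a first-order autoregression, the gain-degradation variables are drawn uniformly on [0,1] (any white sequence with values in [0,1] would fit), the one-step-correlated measurement noise is realized as an MA(1) sequence, the three Markov-chain states are read as delays of 0, 1 and 2 sampling periods, and the transition matrix `P` is hypothetical.

```python
# Illustrative sketch (not the paper's algorithm): simulate one trajectory
# of the observation model described in the abstract.
import numpy as np

rng = np.random.default_rng(0)

N = 200                     # number of sampling times
a, q = 0.95, 0.1            # assumed AR(1) signal: x_k = a*x_{k-1} + w_k, w_k ~ N(0, q)
c0, c1 = 1.0, 0.5           # assumed MA(1) noise: v_k = c0*u_k + c1*u_{k-1}

# Hypothetical transition matrix of the homogeneous three-state Markov
# chain governing the transmission delay (rows sum to 1).
P = np.array([[0.80, 0.15, 0.05],
              [0.30, 0.60, 0.10],
              [0.30, 0.20, 0.50]])

x = np.zeros(N)             # signal
z = np.zeros(N)             # sensor output with gain degradation
y = np.zeros(N)             # measurement received after transmission
u = rng.standard_normal(N)  # white noise driving the MA(1) sequence
state = 0                   # current delay state of the Markov chain

for k in range(1, N):
    x[k] = a * x[k - 1] + np.sqrt(q) * rng.standard_normal()
    gamma = rng.uniform(0.0, 1.0)       # white gain degradation in [0, 1]
    v = c0 * u[k] + c1 * u[k - 1]       # one-step correlated measurement noise
    z[k] = gamma * x[k] + v             # degraded sensor measurement
    state = rng.choice(3, p=P[state])   # evolve the delay state
    y[k] = z[max(k - state, 0)]         # deliver the (possibly delayed) sample
```

An estimator of the kind derived in the paper would process `y` using only the first- and second-order moments of these processes (the mean and variance of the gain variables, the chain's transition probabilities, and the noise covariances), rather than the realizations themselves.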