SUMMARY

The effect of measurement noise on the phase of a linear network transfer function, computed as the Hilbert transform of the modulus, is investigated experimentally. The case of an optical fibre is treated as a typical example. It is shown that, in the presence of noise, theoretically equivalent methods of performing this transform yield results whose uncertainties depend strongly on the chosen algorithm.
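As a hedged illustration of the kind of computation the abstract describes (not the authors' specific algorithm), the sketch below uses one common formulation of the minimum-phase relation, in which the phase is obtained as the negative Hilbert transform of the logarithm of the modulus. The function name `phase_from_modulus` and the test filter are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.signal import hilbert

def phase_from_modulus(mag):
    """Estimate the phase of a minimum-phase transfer function from
    samples of its modulus |H(f)| taken over one full frequency period.

    Uses the minimum-phase relation: phase = -Hilbert{ln|H|}.
    """
    log_mag = np.log(mag)
    # scipy.signal.hilbert returns the analytic signal x + i*H{x},
    # so the Hilbert transform itself is the imaginary part.
    return -np.imag(hilbert(log_mag))

# Example: a simple minimum-phase FIR response H(z) = 1 + 0.5 z^{-1},
# whose zero lies inside the unit circle, so the relation holds exactly.
N = 1024
H = np.fft.fft([1.0, 0.5], N)              # frequency-response samples
phase_est = phase_from_modulus(np.abs(H))  # phase recovered from |H| only
phase_true = np.angle(H)                   # reference phase
```

In noise-free conditions the recovered phase matches the true phase to numerical precision; the abstract's point is that once `mag` carries measurement noise, different discretisations of this same transform (FFT-based, cepstral, direct quadrature) propagate that noise very differently.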