We apply the formalism of quantum measurement theory to the idealized measurement of the position of a particle with an optical interferometer, finding that the backaction of counting entangled photons systematically collapses the particle's wavefunction toward a narrow Gaussian wavepacket centered at the location $x_\mathrm{est}$ determined by the measurement, without appeal to environmental decoherence or to any other spontaneous collapse mechanism. Further, the variance in the particle's position, as calculated from the post-measurement wavefunction, agrees precisely with the shot-noise-limited uncertainty of the measured $x_\mathrm{est}$. Both the identification of the absolute square of the particle's initial wavefunction as the probability density for $x_\mathrm{est}$ and the de Broglie hypothesis emerge as consequences of interpreting the intensity of the optical field as proportional to the probability of detecting a photon. The linear-momentum information encoded in the particle's initial wavefunction survives the measurement, and the pre-measurement expectation values are preserved in the ensemble average.
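As a schematic illustration of the stated agreement (the symbols $N$ for the number of detected photons and $\sigma_0$ for the post-measurement wavepacket width are our own notation, not fixed by the abstract, and the proportionality constant depends on the interferometer geometry), the collapsed state and its shot-noise-limited width may be sketched as
\[
\psi_\mathrm{post}(x) \propto \exp\!\left[-\frac{(x - x_\mathrm{est})^2}{4\sigma_0^2}\right],
\qquad
\sigma_0 = \Delta x_\mathrm{est} \propto \frac{1}{\sqrt{N}},
\]
so that the variance of the post-measurement Gaussian wavepacket coincides with the shot-noise-limited uncertainty in the estimate $x_\mathrm{est}$.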