The upcoming direct detection of gravitational waves will open a window onto the strong-field regime of general relativity. As a consequence, waveforms that include deviations from general relativity have been developed, e.g., in the parametrized post-Einsteinian (ppE) approach. TIGER, a data analysis pipeline that builds Bayesian evidence to support or question the validity of general relativity, has been written and tested. In particular, it was shown that the LIGO and Virgo detectors can probe deviations from general relativity in a regime that cannot be tested by Solar System tests or pulsar timing measurements. However, evidence from several detections is required before a deviation from general relativity can be confidently claimed. An interesting consequence is that, should general relativity not be the correct theory of gravity in the strong-field regime, using standard general relativity templates for the matched-filter analysis of interferometer data will bias the measured gravitational wave parameters, with potentially serious consequences for astrophysical inferences such as the coalescence rate or the mass distribution. In this work we consider three heuristic deviations from general relativity and show that the biases introduced by assuming the validity of general relativity in the estimated parameters of gravitational waves emitted during the inspiral phase of spinless compact binary coalescences manifest in various ways. The mass parameters are usually the most affected, with biases as large as 30 standard deviations for the symmetric mass ratio, and nearly one percent for the chirp mass, which is normally estimated with subpercent accuracy. The other parameters do not show significant biases. We conclude that statements about the nature of the observed sources, e.g., whether both objects are neutron stars, depend critically on the explicit assumption that general relativity is the correct theory of gravity in the strong-field regime.
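For concreteness, in the ppE framework of Yunes and Pretorius the leading-order frequency-domain inspiral waveform is deformed as

\[
\tilde{h}(f) = \tilde{h}_{\rm GR}(f)\,\bigl(1 + \alpha\, u^{a}\bigr)\, e^{i \beta\, u^{b}}, \qquad u \equiv \left(\pi \mathcal{M} f\right)^{1/3},
\]

where \(\mathcal{M}\) is the chirp mass and \((\alpha, a, \beta, b)\) parametrize the deviation; general relativity is recovered for \(\alpha = \beta = 0\).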
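As a minimal sketch of the bias mechanism (not the TIGER pipeline or the analysis performed in this work), the toy script below injects a Newtonian-order inspiral carrying a heuristic ppE-style phase term and recovers the chirp mass with pure general relativity templates by maximizing the match. The waveform order, the flat noise spectrum, and the deviation parameters \((\beta, b)\) are illustrative assumptions only.

```python
# Toy sketch of a "stealth bias": a Newtonian-order frequency-domain inspiral
# signal carries a heuristic ppE-style phase term beta * u^b, but is recovered
# with pure-GR templates by maximizing the match over chirp mass, coalescence
# time, and phase.  Waveform order, flat noise spectrum, and (beta, b) values
# are illustrative assumptions, not the configuration used in the paper.
import numpy as np

MSUN_S = 4.925491e-6  # G * M_sun / c^3 in seconds


def gr_phase(f, mc):
    """Leading (Newtonian) order stationary-phase inspiral phase; mc in seconds."""
    return 3.0 / 128.0 * (np.pi * mc * f) ** (-5.0 / 3.0)


def waveform(f, mc, beta=0.0, b=0.0):
    """f^(-7/6) amplitude with GR phase plus a ppE-style term beta * u^b."""
    u = (np.pi * mc * f) ** (1.0 / 3.0)
    return f ** (-7.0 / 6.0) * np.exp(1j * (gr_phase(f, mc) + beta * u ** b))


def match(h1, h2, df):
    """Normalized overlap, maximized over coalescence time (inverse FFT over
    time shifts) and phase (modulus); a flat noise spectrum is assumed."""
    n = 4 * len(h1)
    overlap = np.abs(np.fft.ifft(h1 * np.conj(h2), n)).max() * n * df
    norm = np.sqrt(np.sum(np.abs(h1) ** 2) * df * np.sum(np.abs(h2) ** 2) * df)
    return overlap / norm


f = np.arange(20.0, 1024.0, 0.0625)  # frequency grid [Hz]
df = f[1] - f[0]
mc_true = 1.219 * MSUN_S             # chirp mass of a 1.4 + 1.4 Msun binary
signal = waveform(f, mc_true, beta=0.02, b=-3.0)  # non-GR signal (b = -3: 1PN-like)

# Recover the chirp mass with GR-only templates: the best match is displaced
# from the true value, even though no deviation parameter was searched over.
grid = mc_true * np.linspace(0.98, 1.02, 801)
best = grid[np.argmax([match(signal, waveform(f, mc), df) for mc in grid])]
print(f"fractional chirp-mass bias: {(best - mc_true) / mc_true:+.2e}")
```

The point of the sketch is that the phase deviation can be largely absorbed by a shift of the mass parameters while the match remains high, which is why such a bias can go unnoticed in a single-template GR analysis.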