To reduce the implementation complexity of maximum likelihood sequence detection (MLSD), equalized maximum likelihood receivers are often used. Such a receiver employs an equalizer to shorten the channel response to a short target response to which the Viterbi detector is matched. Existing equalizer and target adaptation schemes are usually based on the minimum mean-square error (MMSE) criterion, which is not necessarily optimal in terms of the bit-error rate at the Viterbi detector output. In this paper we consider minimum bit-error rate joint adaptation of the equalizer and target response and present a practical adaptation algorithm that achieves near-minimum bit-error rate performance. Compared with MMSE adaptation, the new equalizer and target adaptation scheme shows significant performance improvements in the presence of channel nonlinearities and media noise. This is very promising for high-density recording systems, which are hampered mainly by media noise and channel nonlinearities. Moreover, from a complexity standpoint, the proposed algorithm is comparable to MMSE-based algorithms.