We study the retrieval behaviors of neural networks which are trained to optimize their performance for an ensemble of noisy example patterns. In particular, we consider (1) the performance overlap, which reflects the performance of the network in an operating condition identical to the training condition; (2) the storage overlap, which reflects the ability of the network to merely memorize the stored information; (3) the attractor overlap, which reflects the precision of retrieval for dilute feedback networks; and (4) the boundary overlap, which defines the boundary of the basin of attraction, and hence the associative ability, for dilute feedback networks. We find that for sufficiently low training noise, the network optimizes its overall performance by sacrificing the individual performance of a minority of patterns, resulting in a two-band distribution of the aligning fields. For a narrow range of storage levels, the network loses and then regains its retrieval capability as the training noise level increases; we interpret this reentrant retrieval behavior as arising from competing tendencies in structuring the basins of attraction of the stored patterns. Reentrant behavior is also observed in the space of synaptic interactions, in which the replica-symmetric solution of the optimal network destabilizes and then restabilizes as the training noise level increases. We summarize these observations by picturing training noise as an instrument for widening the basins of attraction of the stored patterns at the expense of reducing the precision of retrieval.
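For orientation, the overlaps named above are variants of the standard retrieval overlap between the network state and a stored pattern, evaluated under different operating conditions; as a minimal sketch, assuming the usual notation of binary neurons $S_i = \pm 1$ and stored patterns $\xi_i^\mu = \pm 1$ (symbols not fixed by this abstract itself),
$$
m^\mu = \frac{1}{N} \sum_{i=1}^{N} \xi_i^\mu S_i ,
$$
where $N$ is the number of neurons and $m^\mu = 1$ corresponds to perfect retrieval of pattern $\mu$.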