We propose a novel training method, hardware-conscious software training (HCST), for deep neural network inference accelerators that recovers the accuracy degradation caused by their hardware imperfections. The proposed training is conducted entirely in software, with a forward inference path and backpropagation that reflect the hardware imperfections, thereby overcoming the limited endurance, the switching nonlinearity, and the switching asymmetry of the nonvolatile memories that store the weights and biases. HCST reformulates the mathematical expressions of the forward propagation and of the gradient calculation in backpropagation so that they replicate the hardware structure under the influence of variations in the chip fabrication process. The effectiveness of this approach is validated through experiments on the MNIST dataset, demonstrating its capability to restore the degraded accuracy. A circuit design is also disclosed for measuring the offset voltages and open-loop gains of the operational amplifiers used in the accelerator.
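As an illustrative sketch only (the paper's exact formulation is not reproduced here), the core idea of reflecting hardware imperfections in the software forward path can be modeled as a linear layer whose inference uses nonlinearly saturated, variation-perturbed weights rather than the ideal ones. The class name `HCSTLinear`, the `tanh` saturation model, and the parameters `sigma` and `g_max` are assumptions for illustration, not the authors' definitions:

```python
import numpy as np

rng = np.random.default_rng(0)

class HCSTLinear:
    """Linear layer whose forward pass models NVM imperfections:
    a clipped dynamic range, a saturating (nonlinear) conductance
    response, and fixed per-device process variation.
    All device models here are illustrative assumptions."""

    def __init__(self, n_in, n_out, sigma=0.05, g_max=1.0):
        self.w = rng.normal(0.0, 0.1, (n_in, n_out))  # ideal software weights
        self.b = np.zeros(n_out)
        self.g_max = g_max
        # Process variation is sampled once per device, mimicking a
        # fabricated chip whose deviations are fixed after manufacture.
        self.variation = rng.normal(0.0, sigma, (n_in, n_out))

    def effective_weight(self):
        # Clip to the device conductance range, apply a saturating
        # nonlinearity, then perturb by the fixed per-device variation.
        w_clip = np.clip(self.w, -self.g_max, self.g_max)
        w_nl = self.g_max * np.tanh(w_clip / self.g_max)
        return w_nl * (1.0 + self.variation)

    def forward(self, x):
        # Inference uses the imperfect weights, so training on this
        # forward path adapts the network to the hardware's behavior.
        return x @ self.effective_weight() + self.b
```

Training against `forward` (instead of the ideal `x @ w + b`) is what lets purely software-side optimization compensate for the device nonidealities; the backward pass would likewise differentiate through the imperfect weight mapping.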