Mutually coupled neural networks are known to be well suited to solving optimization problems. We have previously shown that a neural network tolerant to unidirectional faults can be realized by automatically selecting one of two complementary representations of the solution, and that a fault-tolerant neural network can be realized by a triplication structure with an appropriate merging function even when stuck-at-1 and stuck-at-0 faults occur simultaneously. These realizations rely on the conventional model of a neuron, the basic component of a neural network, in which the output is determined by a sigmoid function of the sum of the inputs multiplied by their respective weights (connection coefficients). This paper shows that if a neuron can compute a function of higher order than the weighted sum of its inputs, a neural network tolerant to mixed stuck-at-1 and stuck-at-0 faults can be realized by a duplication structure. Since duplication suffices, less hardware is expected to be required than in the triplication structure. Although the higher-order neuron function assumed in this paper has not yet been realized, we expect that such a function will be implemented in hardware in the near future as integrated circuit technology progresses. In this sense, the proposal in this paper is practically realistic and provides a new means of implementing fault tolerance in neural networks. © 1999 Scripta Technica, Syst Comp Jpn, 30(10): 22–33, 1999