In the present paper, a new synthesis approach is developed for associative memories based on a modified relaxation algorithm. The design (synthesis) problem of feedback neural networks for associative memories is formulated as a set of linear inequalities, so that the use of the pseudo-relaxation method is natural. The pseudo-relaxation training in the synthesis algorithms is guaranteed to converge for the design of neural networks without any constraints on the connection matrix. To demonstrate the applicability of the present results and to compare the present synthesis approach with existing design methods, a pattern recognition example is considered.

Realization of associative memories via a class of neural networks is considered. The goal of associative memories is to store a set of desired patterns as stable memories such that a stored pattern can be retrieved when the input pattern (or the initial pattern) contains sufficient information about that stored pattern. In practice, the desired memory patterns are usually represented by bipolar vectors (or binary vectors).

Several well-known synthesis methods are available in the literature, including the outer product method [5], the projection learning rule [32, 33], and the eigenstructure method [9, 21] (for an overview of these synthesis methods, see [19]). The outer product method requires that the desired patterns be mutually orthogonal in order for all of them to be stored in the network. The projection learning rule does not require that the prototype patterns be mutually orthogonal, but it cannot guarantee that an equilibrium corresponding to a given desired memory is asymptotically stable. The eigenstructure method appears to be the most effective: it can store any set of bipolar patterns as stable memories, which need not be mutually orthogonal and which correspond to asymptotically stable equilibria of a neural network.
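To make the outer product method and the retrieval process concrete, the following is a minimal sketch. The function names and the example bipolar patterns are ours, chosen for illustration; the construction itself is the classical Hebbian outer-product rule with a zeroed diagonal, and retrieval iterates a sign nonlinearity starting from the (possibly corrupted) input pattern. This is a generic sketch of the classical rule, not the synthesis method developed in this paper.

```python
import numpy as np

def outer_product_weights(patterns):
    """Outer product (Hebbian) rule: W = sum_k x_k x_k^T with zero diagonal.

    `patterns` is a list of bipolar (+1/-1) vectors.
    """
    n = len(patterns[0])
    W = np.zeros((n, n))
    for x in patterns:
        x = np.asarray(x, dtype=float)
        W += np.outer(x, x)
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def recall(W, probe, max_iters=50):
    """Iterate the sign nonlinearity until a fixed point (a stored memory)."""
    x = np.asarray(probe, dtype=float)
    for _ in range(max_iters):
        x_new = np.sign(W @ x)
        x_new[x_new == 0] = 1.0  # break ties toward +1
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x
```

With two mutually orthogonal patterns of length 8, flipping one bit of a stored pattern and iterating `recall` returns the uncorrupted pattern; with non-orthogonal patterns this guarantee is lost, which is the limitation noted above.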
The eigenstructure method has also been generalized for the synthesis of neural networks with predetermined constraints on the interconnecting structure [21, 5]. In these synthesis methods, a set of linear equations is formulated and solved for the design of neural networks. In the design method developed in [2, 3], a set of linear inequalities is formulated and solved for the optimal mean-square-error (MSE) solution using the Ho-Kashyap method [4]. In the design method developed in [31], a set of linear inequalities is formulated and solved using optimization techniques. In the design method presented in [37], a set of linear inequalities is formulated and solved using linear programming.

This paper makes contributions to feedback neural networks for associative memories. In particular, a new synthesis approach will be developed based on the pseudo-relaxation training algorithm. The synthesis approach of the present paper is developed by formulating and solving a set of linear inequalities. The class of feedback neural networks considered in the present paper is described by a system of state equations given in the sequel.
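To illustrate how a set of linear inequalities can be solved iteratively, the following is a minimal sketch of the classical relaxation iteration of Agmon and Motzkin for a feasible system A w >= b, which is the idea underlying pseudo-relaxation training. The function name, the relaxation factor, and the example system are our assumptions for illustration; this is not the exact training algorithm developed in this paper.

```python
import numpy as np

def relaxation_solve(A, b, lam=1.0, max_sweeps=1000, tol=0.0):
    """Relaxation method for a system of linear inequalities A w >= b.

    Whenever a constraint a_i . w >= b_i is violated, move w toward that
    constraint's hyperplane by a step scaled with the relaxation factor
    `lam` in (0, 2).  Sweeps repeat until no constraint is violated.
    """
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    w = np.zeros(A.shape[1])
    for _ in range(max_sweeps):
        updated = False
        for a_i, b_i in zip(A, b):
            violation = b_i - a_i @ w
            if violation > tol:  # constraint is violated
                w = w + lam * violation / (a_i @ a_i) * a_i
                updated = True
        if not updated:  # all inequalities satisfied
            return w
    return w
```

For a feasible system such as w1 >= 1, w2 >= 1, w1 + w2 >= 3, the iteration terminates at a point satisfying every inequality; in the synthesis setting, each row of A is built from a desired memory pattern and w collects entries of the connection matrix.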