We consider the categorization problem in a Hopfield network with an extensive number of concepts, $p=\alpha N$, trained with $s$ examples of weight $\lambda_\sigma$, $\sigma=1,\dots,s$, in the presence of synaptic noise represented by a dimensionless "temperature" $T$. We find that the retrieval capacity of an example with weight $\lambda_1$, and the corresponding categorization error, also depend on the arithmetic mean $m$ of the other weights. The categorization process is similar to that in a network trained with Hebb's rule, but for $\lambda_1/m>1$ the retrieval phase is enhanced. We present the phase diagram in the $T$--$\alpha$ plane, together with the de Almeida--Thouless line of instability. The phase diagrams in the $\alpha$--$s$ plane are discussed in the absence of synaptic noise and for several values of the correlation parameter $b$.
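The setup described above can be illustrated with a minimal numerical sketch: a weighted Hebbian rule, $J_{ij}\propto\sum_\sigma \lambda_\sigma \xi_i^\sigma \xi_j^\sigma$, trained on $s$ examples per concept, each correlated with its concept through a parameter $b$. All concrete values here (network size, weights `lam`, cue noise level) are illustrative choices, not the paper's parameters, and the zero-temperature dynamics stands in for the finite-$T$ analysis of the text.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 500          # neurons (illustrative size, not the thermodynamic limit)
p = 5            # concepts, so alpha = p/N is small here
s = 3            # examples per concept
b = 0.5          # example-concept correlation parameter
lam = np.array([2.0, 1.0, 1.0])  # example weights; lam[0]/mean(lam[1:]) > 1

# Concepts: random +-1 patterns
concepts = rng.choice([-1, 1], size=(p, N))

# Examples: each bit agrees with its concept with probability (1 + b) / 2
flip = rng.random((p, s, N)) < (1 - b) / 2
examples = concepts[:, None, :] * np.where(flip, -1, 1)

# Weighted Hebbian couplings J_ij = (1/N) sum_{mu,sigma} lam_sigma xi_i xi_j
J = np.einsum('s,msi,msj->ij', lam, examples, examples) / N
np.fill_diagonal(J, 0.0)

# Zero-temperature (T = 0) asynchronous dynamics from a noisy cue
# of example sigma = 1 of the first concept (10% of bits flipped)
state = examples[0, 0].copy()
state[rng.random(N) < 0.1] *= -1
for _ in range(20):
    for i in rng.permutation(N):
        state[i] = 1 if J[i] @ state >= 0 else -1

m_example = state @ examples[0, 0] / N  # retrieval overlap with the cued example
m_concept = state @ concepts[0] / N     # categorization overlap with the concept
print(m_example, m_concept)
```

With the dominant weight `lam[0]` the network settles close to the cued example (retrieval), while its overlap with the underlying concept stays near $b$, which is the categorization behavior the abstract refers to.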