The intramodal relation between perceptual similarity and categorization performance was explored in a psychological space derived from multidimensional scaling (MDS) of similarity judgments. Participants learned to classify transformed object shapes into three categories, either visually or haptically, under one of two training procedures (random or systematic), followed by a transfer test. Learning modulated the psychological spaces, and this effect was more pronounced in the haptic than in the visual task. A prototype model of similarity ratings was illustrated in MDS space: rather than mirroring the bidirectional paths of their origins, the prototypes were scaled at the center of a category. Although they converged at the apex of two transformational trajectories, the category prototypes were anchored at the centroid of their respective categories and became more structured as a function of learning. The reduced tendency to make errors (i.e., higher accuracy) in recognizing and classifying the category prototypes suggested that the prototypical representation of a category, abstracted from exemplar averaging, functioned more as novel than as familiar information. The findings are discussed in terms of transformational knowledge, categorical representation in three-dimensional (3D) space, and intramodal visual and haptic similarity.