Human beings try to interpret and read other minds. This is the process of cognitive empathizing, which can be implicit and intuitive, or explicit and deliberate. The process also qualifies as a form of complex problem-solving, in which the focal problem is another person’s mental states. Hence, cognitive empathizing by digitally augmented agents will exhibit the characteristics of digitalized problem-solving discussed in the preceding chapter. It follows that augmented agents may combine human myopia and bias with overly farsighted, artificial sampling and search of other minds. Augmented agents will then misread other minds, often viewing them as unrealistic, irrational, or deviant. This chapter examines the origins and implications of these effects, especially for interpersonal trust and cooperation.