In this article we show the existence of a formal convergence between the matrix models of biological memories and the vector space models designed to extract information from large collections of documents. We first show that, formally, the term-by-document matrix (a mathematical representation of a set of codified documents) can be interpreted as an associative memory. In this framework, the dimensionality reduction of the term-by-document matrices produced by Latent Semantic Analysis (LSA) shares a common factor with matrix biological memories. This factor consists of the generation of a statistical 'conceptualization' of the data using weakly dispersed weighted averages. Then, we present a class of matrix memories that build up thematic blocks using multiplicative contexts. These thematic memories define modular networks that can be accessed using contexts as passwords. This mathematical structure emphasizes the points of contact between LSA and matrix memory models, and invites us to interpret LSA, and similar procedures, as a kind of reverse engineering applied to context-deprived cognitive products, or to biological objects (e.g. genomes) selected during long evolutionary processes.
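The two operations described above can be sketched numerically. The following Python fragment is a minimal illustration, not the paper's own implementation: the toy term-by-document matrix, the retained rank k, and the context vectors are all assumptions chosen for clarity. It shows (i) an LSA-style truncated SVD producing a low-rank 'conceptualized' matrix that can be probed like an associative memory, and (ii) a context-dependent matrix memory built with multiplicative (Kronecker product) contexts, where the context acts as a password selecting which stored association is recalled.

```python
import numpy as np

# Toy term-by-document matrix: rows = terms, columns = documents.
# (Hypothetical data, for illustration only.)
A = np.array([
    [2, 1, 0, 0],
    [1, 2, 0, 1],
    [0, 0, 3, 1],
    [0, 1, 1, 2],
], dtype=float)

# --- LSA-style dimensionality reduction via truncated SVD ---
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2  # number of retained latent dimensions (assumed)
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
# A_k is a low-rank version of A: each entry becomes a weighted
# average over the dominant latent (thematic) directions, the
# statistical 'conceptualization' mentioned in the abstract.

# --- Reading the same matrix as an associative memory ---
# Probing with a one-hot document vector recalls that document's
# term profile, smoothed by the latent structure.
d_query = np.zeros(4); d_query[0] = 1.0   # probe with document 1
recalled_terms = A_k @ d_query            # blended term profile

# --- Context-dependent memory via multiplicative contexts ---
# M stores pairs (key g -> output f_i) gated by orthogonal context
# vectors c_i; the Kronecker product binds context and key.
c1 = np.array([1.0, 0.0]); c2 = np.array([0.0, 1.0])  # contexts
g = np.array([1.0, 0.0, 0.0])                          # shared key
f1 = np.array([1.0, 0.0]); f2 = np.array([0.0, 1.0])   # outputs
M = np.outer(f1, np.kron(c1, g)) + np.outer(f2, np.kron(c2, g))

# The same key recalls different outputs under different contexts:
print(M @ np.kron(c1, g))  # -> f1 (context c1 'unlocks' block 1)
print(M @ np.kron(c2, g))  # -> f2 (context c2 'unlocks' block 2)
```

Because the contexts are orthogonal, the cross terms vanish in the recall products, so each context retrieves only its own thematic block; this is the sense in which contexts behave as passwords to a modular network.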