In this paper, we study relational networks, which may be as large as social networks or as small as neural networks. We employ the concepts of closure and closure operators to describe their structure, and we introduce the idea of functional transformation to model their dynamic behavior. One transformation, ω, reduces a complex network to a much simpler form while preserving important properties such as path connectivity and centrality measures. The other transformation, ε, expands a network using grammar-like productions. Both transformations are continuous with respect to closure, and we show that ε is effectively the inverse of ω. It is thought that ω may model human memory consolidation and that ε may model memory reconstruction.
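As background for the closure concepts the abstract invokes: an operator φ on subsets of a network is a closure operator when it is extensive (Y ⊆ φ(Y)), monotone (Y ⊆ Z implies φ(Y) ⊆ φ(Z)), and idempotent (φ(φ(Y)) = φ(Y)). The following Python sketch (the graph and names are hypothetical, not taken from the paper) illustrates one such operator, reachability in a directed graph, and checks the three axioms:

```python
# Hypothetical illustration: the reachable-set operator on a directed
# graph is a closure operator on vertex sets.
from itertools import chain, combinations

GRAPH = {1: {2}, 2: {3}, 3: set(), 4: {1}}  # adjacency sets

def closure(seed):
    """phi(Y): every vertex reachable from Y, including Y itself."""
    result, frontier = set(seed), set(seed)
    while frontier:
        nxt = set().union(*(GRAPH[v] for v in frontier)) - result
        result |= nxt
        frontier = nxt
    return result

def subsets(universe):
    """All subsets of the vertex set."""
    s = list(universe)
    return chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))

# Verify the three closure axioms over every pair of vertex subsets.
for y in map(set, subsets(GRAPH)):
    cy = closure(y)
    assert y <= cy                   # extensive:  Y subset of phi(Y)
    assert closure(cy) == cy         # idempotent: phi(phi(Y)) = phi(Y)
    for z in map(set, subsets(GRAPH)):
        if y <= z:
            assert cy <= closure(z)  # monotone
print("closure axioms hold")
```

Any operator satisfying these axioms supports the same continuity arguments; reachability is used here only because it is easy to verify exhaustively on a small graph.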