A fringe subtree of a rooted tree is the subtree induced by one of its vertices together with all of its descendants. We consider the problem of estimating the number of distinct fringe subtrees in random trees under a generalized notion of distinctness, which allows for many different interpretations of what "distinct" trees are. The random tree models considered are simply generated trees and families of increasing trees (recursive trees, $d$-ary increasing trees and generalized plane-oriented recursive trees). We prove that, under rather mild assumptions on what "distinct" means, the number of distinct fringe subtrees in a random tree with $n$ vertices is of order $n/\sqrt{\log n}$ for simply generated trees and $n/\log n$ for increasing trees.
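As an illustration of the counting problem (not taken from the paper), the sketch below generates a random recursive tree and counts its distinct fringe subtrees under one concrete notion of distinctness: isomorphism as unordered rooted trees. The function names and the canonical-form encoding are illustrative choices, not the paper's method.

```python
import random

def random_recursive_tree(n, seed=0):
    """Children lists for a random recursive tree on vertices 0..n-1:
    each new vertex attaches to a uniformly random earlier vertex."""
    rng = random.Random(seed)
    children = [[] for _ in range(n)]
    for v in range(1, n):
        children[rng.randrange(v)].append(v)
    return children

def distinct_fringe_subtrees(children):
    """Count distinct fringe subtrees, where two fringe subtrees count
    as equal iff they are isomorphic as unordered rooted trees."""
    n = len(children)
    canon = [None] * n   # canonical form of the fringe subtree at each vertex
    shapes = set()
    # Process vertices in reverse label order: in a recursive tree every
    # child has a larger label than its parent, so all descendants of v
    # are finished before v itself is processed.
    for v in range(n - 1, -1, -1):
        canon[v] = tuple(sorted(canon[c] for c in children[v]))
        shapes.add(canon[v])
    return len(shapes)

tree = random_recursive_tree(1000, seed=42)
print(distinct_fringe_subtrees(tree))
```

Other notions of distinctness fit the same scheme by swapping the canonical form: keeping children in their original order distinguishes plane trees, and incorporating labels distinguishes labelled trees.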