The amount of electronic information available for access in digital libraries and on the Web is growing fast. An immediate consequence is that the task of finding useful information becomes difficult. Improving upon this situation requires progress in the design and implementation of information retrieval systems, among them, ranking algorithms. The Vector Space Model is an approach that has been used over the years to provide such ranking. In this model, each index term corresponds to a vector, and these vectors together generate the basis of the vector space of interest. In this basis, the vectors are pairwise orthogonal, indicating that the corresponding terms are mutually independent. However, this simplification does not correspond to reality. In this work, we present an extension of the Vector Model that takes into account the correlation between terms. In the proposed model, term vectors, originally orthogonal, are rotated in space so as to geometrically reflect the semantic dependence among terms. This rotation can be performed with any technique that generates information on the relationships among the terms of the collection. We propose two such techniques, namely, association rules and the generation of lexicographically similar terms. The generation of association rules is a well-known data mining technique; in information retrieval, it is used to find sets of terms that co-occur in the document collection. The technique of obtaining lexicographically similar terms is a strategy akin to the extraction of word stems (stemming). The retrieval effectiveness of the proposed model is evaluated for the two techniques using the measures of precision and recall. The results show that our model improves average precision, relative to the standard Vector Model, for all collections evaluated, with gains of up to 31%.
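To make the geometric idea concrete, the following is a minimal sketch of the effect of rotating term vectors. It assumes a two-term space and an illustrative correlation value of 0.5 (hypothetical, not taken from the evaluation); the paper's actual rotation is driven by correlations mined via association rules or lexicographic similarity.

```python
import math

# Standard Vector Model: index terms are pairwise orthogonal basis vectors.
t1 = (1.0, 0.0)
t2 = (0.0, 1.0)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cosine(a, b):
    return dot(a, b) / (math.sqrt(dot(a, a)) * math.sqrt(dot(b, b)))

# Extension (sketch): rotate t2 toward t1 by an angle derived from a
# term-correlation measure. The value 0.5 here is purely illustrative.
corr = 0.5
theta = math.acos(corr)              # higher correlation -> smaller angle
t2_rotated = (math.cos(theta), math.sin(theta))

# A document indexed only by term 1, queried with term 2:
doc = t1
print(cosine(doc, t2))          # 0.0 (orthogonal terms: no credit)
print(cosine(doc, t2_rotated))  # 0.5 (correlated terms: partial match)
```

Under the orthogonality assumption, a document mentioning only a correlated term scores zero; after rotation, it receives a partial score proportional to the correlation, which is the effect the proposed model exploits for ranking.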