Estimating performance model parameters of cloud systems presents several challenges due to the distributed nature of the applications, the chains of interactions of requests with architectural nodes, and the parallelism and coordination mechanisms implemented within these systems. In this work, we present a new inference algorithm for model parameters, called the state divergence (SD) algorithm, to accurately estimate resource demands in a complex cloud application. Unlike existing approaches, SD attempts to minimize the divergence between observed and modeled marginal state probabilities for individual nodes within an application, and therefore requires probabilistic measures from both the system and the underpinning model. Validation against a case study using the Apache Cassandra NoSQL database and against random experiments shows that SD can accurately predict demands and improve system behavior modeling and prediction.
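To make the underlying idea concrete, the sketch below shows one way a divergence-based demand estimation could look. It is illustrative only: the squared-error divergence, the truncated M/M/1 queue used as the per-node model, and the parameters `LAMBDA` and `K` are assumptions made here for demonstration, not the paper's actual formulation of SD.

```python
# Illustrative sketch: estimate a node's resource demand by minimizing the
# divergence between observed and modeled marginal state probabilities.
# The squared-error divergence and the truncated M/M/1 model are assumptions.
import numpy as np
from scipy.optimize import minimize

K = 10          # number of tracked queue-length states per node (assumption)
LAMBDA = 5.0    # measured arrival rate at the node, requests/s (assumption)

def modeled_marginals(demand: float) -> np.ndarray:
    """Marginal state probabilities of an M/M/1 queue truncated at K states,
    given the per-request resource demand (mean service time) at the node."""
    rho = min(LAMBDA * demand, 0.999)          # utilization, kept below 1
    p = (1.0 - rho) * rho ** np.arange(K)      # P(n) = (1 - rho) * rho^n
    return p / p.sum()                         # renormalize after truncation

def divergence(x, observed: np.ndarray) -> float:
    """Squared divergence between observed and modeled marginal probabilities."""
    demand = float(np.ravel(x)[0])             # the optimizer passes a 1-element array
    return float(np.sum((observed - modeled_marginals(demand)) ** 2))

# Observed marginal state probabilities, e.g. obtained by sampling the node's
# queue length; here synthesized from a "true" demand of 0.12 s plus noise.
observed = modeled_marginals(0.12) + np.random.normal(0.0, 0.002, K)
observed = np.clip(observed, 0.0, None)
observed /= observed.sum()

# Estimate the demand by minimizing the state divergence.
result = minimize(divergence, x0=[0.05], args=(observed,),
                  bounds=[(1e-4, 1.0 / LAMBDA)])
print(f"estimated demand: {result.x[0]:.4f} s")
```

In this sketch the squared error could be swapped for another divergence measure (e.g., a Kullback-Leibler divergence), and the single-queue model for whatever performance model underpins the application being studied; the structure of the estimation loop stays the same.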