Recent technological developments make it possible to gather huge amounts of data from different types of sensors, social networks, intelligence reports, distributed databases, etc. The quantity and heterogeneity of these data have forced information systems to evolve: nowadays they rely on complex information-processing techniques organized in multiple processing stages. Unfortunately, possessing large quantities of data and being able to implement complex algorithms do not guarantee that the extracted information will be of good quality. Decision-makers need good-quality information in the decision-making process. We stress that, for a decision-maker, both the information and its quality, viewed as meta-information, are of great importance. A system that does not report information quality to its user risks being used incorrectly or, in more dramatic cases, not being used at all. The literature, especially in organizational management and information retrieval, offers some information-quality evaluation methodologies, but none of them allows information quality to be evaluated in complex and changing environments. We propose a new information-quality methodology capable of estimating information quality dynamically as the data and/or the inner structure of the information system change. Our methodology instantaneously updates the quality of the system's output. To capture how information quality changes through the system, we introduce the notion of a quality transfer function. It is the analogue, at the quality level, of the transfer function of signal processing: it describes the influence of a processing module on information quality. We also present two different views of information quality: a global one, characterizing the entire system, and a local one, for each processing module.
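The idea of a quality transfer function composed across processing modules can be sketched as follows. This is a minimal illustration, not the paper's actual formulation: the transfer functions, the scalar quality score in [0, 1], and the module names are all assumptions made for the example.

```python
# Sketch: propagating an information-quality score through a pipeline
# of processing modules. Each module's "quality transfer function" maps
# input quality to output quality, by analogy with a signal-processing
# transfer function. Composing the local (per-module) functions yields
# the global quality of the system's output.

from typing import Callable, List

QualityTransfer = Callable[[float], float]

def system_output_quality(source_quality: float,
                          modules: List[QualityTransfer]) -> float:
    """Compose per-module quality transfer functions (local view)
    to obtain the quality of the whole system's output (global view)."""
    q = source_quality
    for transfer in modules:
        q = transfer(q)
    return q

# Illustrative transfer functions (assumed, not from the paper):
noisy_sensor = lambda q: 0.9 * q            # this stage degrades quality
fusion_step  = lambda q: min(1.0, q + 0.1)  # fusion can improve quality

print(system_output_quality(0.8, [noisy_sensor, fusion_step]))
```

Because each module carries its own transfer function, replacing one module (or observing a change in the input data quality) only requires re-evaluating the composition, which is what allows the output quality to be updated dynamically.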