Although Knowledge Graphs (KGs) are widely used, they often contain false information. Many studies in the literature have been carried out to address this deficiency. These studies correct triples, relations, relation types, and literal values, or enrich the KG by generating new triples and relations. The proposed methods can be grouped into closed-world approaches, which consider only the KG itself, and open-world approaches, which use external resources. Recent studies have also incorporated the confidence of triples into the refinement process. The confidence values calculated in these studies affect either the triple itself or, in rule-based models, the ground rule. In this study, a propagation approach based on the confidence of triples is proposed for the refinement process. This method ensures that the effect of confidence spreads over the KG rather than being limited to a single triple, making the KG progressively more stable by strengthening strong relationships and eliminating weak ones. Another limitation of existing studies is that they treat refinement as a one-time operation and pay little attention to processing performance. However, real-world KGs are live, dynamic, and constantly evolving systems; therefore, the proposed approach should support continuous refinement. To measure this, experiments were carried out with varying data sizes and rates of false triples, using the FB15K, NELL, WN18, and YAGO3-10 datasets, which are commonly used in refinement studies. Despite increasing data size and false-information rates, an average accuracy of 90% and an average precision of 98% were achieved across all datasets.
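To make the idea of confidence propagation concrete, the following is a minimal sketch, not the paper's actual algorithm: it assumes a hypothetical toy KG of (head, relation, tail) triples with initial confidence scores, blends each triple's score with the mean score of triples sharing an entity, and prunes triples whose score stays below a threshold. All entity names, scores, and parameters (`alpha`, `threshold`) are illustrative assumptions.

```python
from collections import defaultdict

# Hypothetical toy KG: (head, relation, tail) -> initial confidence.
# The last triple is deliberately false and isolated from the rest.
triples = {
    ("Ankara", "capitalOf", "Turkey"): 0.90,
    ("Istanbul", "locatedIn", "Turkey"): 0.85,
    ("Turkey", "memberOf", "NATO"): 0.80,
    ("Paris", "capitalOf", "Mars"): 0.25,
}

def propagate(triples, alpha=0.5, iterations=3, threshold=0.4):
    """Spread confidence between triples that share an entity.

    Each triple's score is blended with the mean score of its
    neighbours (triples mentioning the same head or tail entity);
    triples still below `threshold` afterwards are pruned.
    """
    scores = dict(triples)
    for _ in range(iterations):
        # Index triples by the entities they mention.
        by_entity = defaultdict(list)
        for t in scores:
            head, _, tail = t
            by_entity[head].append(t)
            by_entity[tail].append(t)
        updated = {}
        for t, s in scores.items():
            head, _, tail = t
            neighbours = [n for e in (head, tail)
                          for n in by_entity[e] if n != t]
            if neighbours:
                mean_n = sum(scores[n] for n in neighbours) / len(neighbours)
                updated[t] = alpha * s + (1 - alpha) * mean_n
            else:
                updated[t] = s  # isolated triples keep their own score
        scores = updated
    return {t: s for t, s in scores.items() if s >= threshold}

refined = propagate(triples)
```

In this toy run, the mutually supporting triples reinforce each other, while the isolated false triple retains its low score and is eliminated, which is the intended "strengthen strong, remove weak" behaviour in miniature.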