Classification tasks are tackled in a plethora of scientific fields, such as astronomy, finance, healthcare, human mobility, and pharmacology, to name a few. Classification is a supervised learning approach that uses labeled data to assign instances to classes. A common approach to these tasks is ensemble methods: methods that employ a set of models, instead of just one, and combine the predictions of every model to obtain the prediction of the whole. Common obstacles in ensemble learning are the choice of base models and how best to aggregate the predictions of each individual to produce the ensemble's prediction. An ensemble is also expected to mitigate the weaknesses of its members while pooling their strengths. It is in this context that Evolutionary Directed Graph Ensembles (EDGE) thrives. EDGE is a machine learning tool based on social dynamics and the modeling of trust among humans using graph theory. Evolutionary algorithms are used to evolve ensembles of models arranged in a directed acyclic graph structure, where the connections map the trust of each node in its predecessors. The novelty of this approach stems from the fusion of ensemble learning with graphs and evolutionary algorithms. A limitation of EDGE is that it focuses only on changing the topology of the graph ensembles, with its authors hypothesizing about using the learned graphs for other tasks. Our objective is to tackle the limitations of the original proposal of EDGE and bestow upon it new capabilities that improve its predictive power. This project proposes a method for updating the weights of the connections between nodes of the graph ensembles, such that the strength of the relationships between nodes evolves over time. Inspired by the notion of meta-learning, we also propose a methodology for saving learned graphs and bootstrapping different datasets with evolved graphs.
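To make the graph-ensemble idea concrete, the following is a minimal sketch of how class-probability predictions could be combined along a directed acyclic graph whose edge weights encode trust in predecessors. The node names, the trust values, and the normalized weighted average are illustrative assumptions, not EDGE's actual aggregation rule.

```python
# Sketch of trust-weighted aggregation over a DAG ensemble.
# Assumption: each node holds a base model's class-probability vector,
# and each directed edge weight encodes trust in a predecessor.
import numpy as np

def node_prediction(node, graph, probs, weights):
    """Blend a node's own probabilities with the trust-weighted
    predictions of its predecessors (recursion is safe: the graph
    is acyclic)."""
    combined = probs[node].copy()
    total = 1.0  # the node's own prediction has unit weight
    for pred in graph.get(node, []):
        w = weights[(pred, node)]
        combined += w * node_prediction(pred, graph, probs, weights)
        total += w
    return combined / total  # normalized weighted average

# Toy example: three base models arranged as a -> c and b -> c.
graph = {"c": ["a", "b"]}  # predecessors of each node
probs = {
    "a": np.array([0.9, 0.1]),
    "b": np.array([0.2, 0.8]),
    "c": np.array([0.6, 0.4]),
}
weights = {("a", "c"): 0.5, ("b", "c"): 1.5}  # hypothetical trust weights
print(node_prediction("c", graph, probs, weights))  # -> [0.45 0.55]
```

Under weight evolution, the values in `weights` would be the quantities mutated over generations, so a node can gradually trust a helpful predecessor more and a misleading one less.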
We endow EDGE with bootstrapping capabilities while investigating a suitable similarity metric for dataset choice, based on the extraction of meta-features from datasets. Compared to the original EDGE, our weight evolution approach improved the results on 3 of the 4 datasets by an average margin of 4.20 percentage points, while the loss on the fourth dataset was only about 0.3 percentage points. Compared with a baseline suite of models, ours achieved the best value on 34 of the 38 datasets, with gains as substantial as 30 percentage points. The bootstrap was shown to be effective in improving predictive power, with the exploitation of previous runs improving the results on 19 out of 21 datasets. The contributions can be summarized as a novel way to evolve graph ensembles, by also evolving the weights between nodes of the graphs, coupled with the idea of bootstrapping any dataset using previous runs from other datasets. The analysis of dataset choice for the bootstrapping led to the proposal of a similarity metric between datasets that can be used to facilitate that choice, without exhaustive or...
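A meta-feature-based similarity between datasets could be sketched as below. The particular meta-features (instance count, attribute count, class count, mean attribute spread), the log damping, and the cosine similarity are illustrative assumptions; the metric actually proposed for EDGE's bootstrap selection may use different meta-features and a different distance.

```python
# Sketch of dataset similarity from statistical meta-features.
# Assumption: simple statistical meta-features stand in for whatever
# meta-feature set the actual metric extracts.
import numpy as np

def meta_features(X, y):
    """Extract a small vector of statistical meta-features."""
    return np.array([
        X.shape[0],                          # number of instances
        X.shape[1],                          # number of attributes
        len(np.unique(y)),                   # number of classes
        float(np.mean(np.std(X, axis=0))),   # mean attribute spread
    ])

def dataset_similarity(fa, fb):
    """Cosine similarity between log-damped meta-feature vectors."""
    fa, fb = np.log1p(fa), np.log1p(fb)  # damp scale differences
    return float(fa @ fb / (np.linalg.norm(fa) * np.linalg.norm(fb)))
```

With such a metric, a new dataset would be bootstrapped with graphs evolved on whichever previously seen dataset scores highest, avoiding an exhaustive trial of every stored run.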