Whoever slays a soul, ... , it is as though he slew all men; and whoever keeps it alive, it is as though he kept alive all men
Quran 5:32

As a footballer, I can't imagine life without the use of one of my legs ...
Sadly, this is exactly what happens to thousands of children every year when they accidentally step on a landmine.
Ryan Giggs
ACKNOWLEDGEMENTS

I would like to express my deepest gratitude to my advisor, Dr. Hichem Frigui, for giving me the opportunity to be a member of his research group and for his support over the course of this work. He provided a very rich working environment, with many opportunities to develop new ideas, work on promising applications, and gain experience in diverse areas. I am indebted to him for his support and help.

ABSTRACT

For complex detection and classification problems involving data with large intra-class variations and noisy inputs, no single source of information can provide a satisfactory solution. As a result, the combination of multiple classifiers is playing an increasing role in solving these complex pattern recognition problems, and it has proven to be a viable alternative to using a single classifier.

Over the past few years, a variety of schemes have been proposed for combining multiple classifiers. Most of these schemes are global: they assign each classifier a degree of worthiness that is averaged over the entire training data. This may not be the optimal way to combine the different experts, since the behavior of each one may not be uniform over the different regions of the feature space. To overcome this issue, a few local methods have been proposed in recent years. Local fusion methods aim to adapt the classifiers' worthiness to the different regions of the feature space. First, they partition the input samples. Then, they identify the best classifier for each partition and designate it as the expert for that partition. Unfortunately, current local methods are computationally expensive, perform these two tasks independently of each other, or both. However, feature-space partitioning and algorithm selection are not independent, and they should be optimized simultaneously.

In this dissertation, we address these limitations through CELF, a local fusion approach, and several of its extensions. First, the baseline CELF partitions the feature space and learns the optimal classifier combination within each partition simultaneously. Second, we propose CELF-CA, an extension of CELF that adds a regularization term to the objective function to introduce competition among the clusters and to find the optimal number of clusters in an unsupervised way. CELF-CA starts by partitioning the data into a large number of small clusters. As the algorithm progresses, adjacent clusters compete for data points, and clusters that lose the competition gradually become depleted and vanish. Third, we propose CELF-M, which generalizes CELF to multi-class data sets.

The baseline CELF and its extensions were formulated to use linear aggregation to combine the outputs of the different algorithms within each context. For some applications this can be too restrictive, and non-linear fusion may be needed. To address this potential drawback, we propose two other variations of CELF that use non-linear aggregation. The first one is based on Neural Networks (CELF-NN) and the second ...
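To make the two-step local fusion scheme described above concrete, the following is a minimal sketch in Python. It is illustrative only, not the dissertation's implementation: the function names are invented, and it assumes pre-trained scikit-learn-style classifiers exposing a .predict method. It partitions the training data with k-means and then designates the most accurate classifier on each partition as that partition's expert.

    import numpy as np
    from sklearn.cluster import KMeans

    def train_local_fusion(X, y, classifiers, n_partitions=4, seed=0):
        # Step 1: partition the input samples.
        km = KMeans(n_clusters=n_partitions, random_state=seed).fit(X)
        experts = {}
        # Step 2: designate the best classifier on each partition
        # as that partition's expert.
        for c in range(n_partitions):
            mask = km.labels_ == c
            scores = [np.mean(clf.predict(X[mask]) == y[mask])
                      for clf in classifiers]
            experts[c] = classifiers[int(np.argmax(scores))]
        return km, experts

    def predict_local_fusion(km, experts, X):
        # Route each test sample to the expert of its partition.
        return np.array([experts[c].predict(x[None, :])[0]
                         for c, x in zip(km.predict(X), X)])

Note that this sketch performs partitioning and expert selection independently of each other, which is precisely the limitation that motivates CELF's simultaneous optimization of the two tasks.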
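The competition mechanism in CELF-CA follows the spirit of competitive agglomeration clustering. As an illustration only (the dissertation's exact objective may differ), the classic competitive agglomeration cost augments the fuzzy clustering distortion with a negative cardinality term, where u_{ij} is the membership of sample x_j in cluster i, beta_i is the prototype of cluster i, and alpha balances the two terms:

    J = \sum_{i=1}^{C}\sum_{j=1}^{N} u_{ij}^{2}\, d^{2}(\mathbf{x}_{j}, \boldsymbol{\beta}_{i})
        \;-\; \alpha \sum_{i=1}^{C} \Bigl( \sum_{j=1}^{N} u_{ij} \Bigr)^{2},
    \qquad \text{subject to } \sum_{i=1}^{C} u_{ij} = 1 \;\; \forall j.

The first term favors compact clusters; the second rewards clusters with large total membership, so adjacent clusters compete for data points and depleted clusters are discarded. Starting from a large number of small clusters, this is how the number of clusters can be found without supervision.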
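Finally, the non-linear variants replace the linear per-context aggregation weights with a learned combiner. Below is a minimal sketch of the CELF-NN idea, again purely illustrative: it assumes binary labels and classifiers exposing scikit-learn's predict_proba, and uses a small feed-forward network as the fusion mapping.

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    def train_nn_combiner(X, y, classifiers):
        # Stack each classifier's confidence as one input feature;
        # a small neural network learns the non-linear fusion rule.
        Z = np.column_stack([clf.predict_proba(X)[:, 1]
                             for clf in classifiers])
        combiner = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000)
        combiner.fit(Z, y)
        return combiner

In CELF-NN, such a combiner would be learned within each context rather than globally, so that the non-linear fusion rule can still adapt to the different regions of the feature space.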