Most modern statistical machine translation systems are based on the linear model. There are many reasons for the prevalence of the linear model: other component models can be incorporated as features, there are many methods for estimating its parameters, and the resulting model scores can easily be used in finite-state representations.

One popular method for estimating the parameters of a linear model is minimum error rate training (MERT). Galley and Quirk describe an optimal MERT algorithm whose runtime is exponential when optimising over multiple sentences. We find that this form of MERT can be represented using convex geometry. Using this geometric representation, we describe an optimal algorithm that runs in polynomial time with respect to the number of feasible solutions, and we describe Projected MERT, a practical implementation of multidimensional MERT in low dimensions.

Using this geometric representation of MERT, we investigate whether the optimisation of linear models is tractable in general. It has been widely believed that the number of feasible solutions of a linear model grows exponentially with the number of training sentences; we show instead that the exponential explosion is due to the dimension of the feature space. This result has important ramifications given the current trend of building statistical machine translation systems around large numbers of sparse features.

We also show how these convex geometric descriptions of linear models can be neatly integrated into finite-state representations using tropical geometry. The resulting semirings provide a formulation of multidimensional MERT that can be applied to lattices and hypergraphs.

In contrast to this theoretical work, we also present practical descriptions of the tools and techniques used for fast parameter estimation and filtering of hierarchical phrase-based translation models and N-gram language models. These techniques allow us to quickly build rich models over large datasets. The resulting models form the basis of a machine translation system that performs competitively at international evaluations.
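For readers unfamiliar with the setting, the linear models discussed above usually take the standard form used in statistical machine translation, in which a candidate translation is scored by a weighted sum of feature functions and MERT chooses the weights that minimise corpus-level error of the induced 1-best translations. The display below is an illustrative sketch in conventional notation rather than the thesis's own; the feature functions $\mathbf{h}$, weights $\mathbf{w}$ and error function are assumed.

\[
\hat{e}(f;\mathbf{w}) \;=\; \arg\max_{e}\; \mathbf{w}^{\top}\mathbf{h}(e,f),
\qquad
\hat{\mathbf{w}} \;=\; \arg\min_{\mathbf{w}}\; \mathrm{Error}\!\left(\hat{e}(f_1;\mathbf{w}),\dots,\hat{e}(f_S;\mathbf{w})\right).
\]

Because every hypothesis score is linear in $\mathbf{w}$, the identity of the 1-best hypothesis changes only where the upper envelope of these linear functions changes; this envelope is the convex-geometric structure that a geometric view of MERT can exploit.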
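For completeness, the tropical semiring referred to above is, in its standard max-plus form, the extended reals with maximisation as semiring addition and ordinary addition as semiring multiplication. This is the textbook definition given for orientation only, not the particular multidimensional semirings constructed in the thesis.

\[
\mathbb{T} \;=\; \bigl(\mathbb{R}\cup\{-\infty\},\;\oplus,\;\otimes\bigr),
\qquad
a \oplus b = \max(a,b),
\qquad
a \otimes b = a + b .
\]

A shortest-distance (best-path) computation over a lattice or hypergraph weighted in this semiring yields the highest-scoring hypothesis under a linear model, which indicates how convex-geometric descriptions of linear models can be carried over to these finite-state representations.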