Abstract. We have designed, implemented and assessed an EBMT system that can be dubbed the 'purest ever built': it strictly makes no use of variables, templates or patterns, has no explicit transfer component, and requires no preprocessing or training of the aligned examples. It relies on a single operation, proportional analogy, which implicitly neutralises divergences between languages and captures lexical and syntactic variations along the paradigmatic and syntagmatic axes without explicitly decomposing sentences into fragments. Exactly the same implementation of this core engine was evaluated on different tasks and language pairs. First, we compared our system on two tasks of a previous MT evaluation campaign in order to rank it among current state-of-the-art systems. Then, we illustrated the 'universality' of our system by participating in a recent MT evaluation campaign, with exactly the same core engine, on a wide variety of language pairs. Finally, we studied the influence of additional data, such as dictionaries and paraphrases, on system performance.

Keywords: example-based machine translation, proportional analogies, divergences across languages

Abbreviations: MT - machine translation; EBMT - example-based machine translation; mWER - multiple word error rate

* The research reported here was supported in part by a contract with the Japanese National Institute of Information and Communications Technology entitled "A study of speech dialogue translation technology based on a large corpus". We are particularly indebted to Prof. C. Boitet for his many comments on an earlier version of the draft, which considerably helped to improve its clarity. Thanks also to the reviewers who pointed out some errors in the draft. Both authors are currently with the Japanese National Institute of Information and Communications Technology (NiCT).