We define a new property called one-sided almost specification, which lies between the properties of specification and almost specification, and prove that it guarantees intrinsic ergodicity (i.e. uniqueness of the measure of maximal entropy) if the corresponding mistake function g is bounded. We also show that uniqueness may fail for unbounded g, such as g(n) = log log n. Our results have consequences for almost specification: we prove that almost specification with g ≡ 1 implies one-sided almost specification (with g ≡ 1), and hence uniqueness. On the other hand, the second author showed recently that almost specification with g ≡ 4 does not imply uniqueness.

1. Introduction

Recently a number of weakened versions of the specification property have been used to study various questions in ergodic theory. These include almost specification [8,11,13,14,18], which allows one to concatenate arbitrary words in the language into a new word in the language, provided a "small" number of letters in each word is allowed to change. The number of changes is controlled by a sublinear "mistake function" g(n); see Section 2 for a formal definition. Almost specification is enough to establish various results in large deviation theory [14] and multifractal analysis [13,18], but it was unknown for some time whether almost specification also implies intrinsic ergodicity. This question was answered in the negative, independently, by [9] and [11]; in fact, [11, Theorem 1.2] shows that intrinsic ergodicity may fail even with the constant mistake function g ≡ 4. (Here and elsewhere, g ≡ C means that g is the constant function C.)

One motivation for almost specification is the fact that many natural examples satisfy it for small g, such as the classical β-shifts, which have almost specification with g ≡ 1 and do satisfy intrinsic ergodicity. In fact, β-shifts satisfy a slightly stronger property: when one wishes to concatenate words w^(1), …, w^(n) together, it suffices to make the permitted number of changes to the words w^(1), …, w^(n−1), leaving the final word w^(n) untouched. Though this might seem like an incremental strengthening, it proves quite important. We call this stronger notion left almost specification (LAS); our main result is that this property actually does imply intrinsic ergodicity if the mistake function g is bounded.

Theorem 1.1. If X is a subshift with left almost specification for a bounded mistake function, then it has a unique measure of maximal entropy µ. Moreover, µ is the limiting distribution of periodic orbits; finally, the system (X, σ, µ) is Bernoulli, has exponential decay of correlations, and satisfies the central limit theorem for Hölder observables.

We prove Theorem 1.1 in §3. This theorem covers the case when X has the usual specification property (see Lemma 2.15). We show in §4.1 that it also covers the case when X has the usual almost specification property with g ≡ 1, and deduce the following.

Corollary 1.2. If X is a subshift with almost specification for the mistake function g ≡ 1, then it has a unique measure of ...
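The precise definitions are given in Section 2; as a rough sketch (the notation here is ours and may differ from the formal statement), writing 𝓛(X) for the language of the subshift X and d_H for Hamming distance, the two properties compare as follows:

```latex
% Sketch only: almost specification allows g-many changes in every word,
% while left almost specification (LAS) leaves the final word untouched.
%
% Almost specification with mistake function g:
\[
\forall\, w^{(1)},\dots,w^{(n)} \in \mathcal{L}(X)\ \ \exists\, v^{(1)},\dots,v^{(n)}
\ \text{ with }\ |v^{(i)}| = |w^{(i)}|,\quad
d_H\bigl(v^{(i)}, w^{(i)}\bigr) \le g\bigl(|w^{(i)}|\bigr),\quad
v^{(1)} v^{(2)} \cdots v^{(n)} \in \mathcal{L}(X).
\]
% Left almost specification (LAS): the same, with the extra requirement
\[
v^{(n)} = w^{(n)},
\]
% i.e. the permitted changes are made only in w^{(1)}, \dots, w^{(n-1)}.
```

In this sketch the containment of properties is visible directly: specification corresponds (roughly) to gluing with no changes, LAS allows changes in all but the last word, and almost specification allows changes in every word.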