We prove stability estimates for the Shannon-Stam inequality (also known as the entropy-power inequality) for log-concave random vectors in terms of entropy and transportation distance. In particular, we give the first stability estimate for general log-concave random vectors in the following form: for log-concave random vectors $X, Y \in \mathbb{R}^d$, the deficit in the Shannon-Stam inequality is bounded from below by the expression
$$
C \left( \mathrm{D}(X \,\|\, G) + \mathrm{D}(Y \,\|\, G) \right),
$$
where $\mathrm{D}(\cdot \,\|\, G)$ denotes the relative entropy with respect to the standard Gaussian and the constant $C$ depends only on the covariance structures and the spectral gaps of $X$ and $Y$. In the case of uniformly log-concave vectors our analysis gives dimension-free bounds. Our proofs are based on a new approach which uses an entropy-minimizing process from stochastic control theory.

Our approach is based on ideas somewhat related to the ones which appear in [8]: the very high-level plan of the proof is to embed the variables $X, Y$ as the terminal points of some martingales and to express the entropies of $X$, $Y$ and $X+Y$ as functions of the associated quadratic co-variation processes. One of the main benefits of using such an embedding is that the co-variation process of $X+Y$ can easily be expressed in terms of the ones of $X, Y$, as demonstrated below. In [8]
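For orientation, and with a normalization that is only one standard choice (the precise deficit functional used later in the paper may be stated differently), the Shannon-Stam inequality in Lieb's form says that for independent random vectors $X, Y \in \mathbb{R}^d$ and any $\lambda \in (0,1)$,
$$
h\!\left(\sqrt{\lambda}\, X + \sqrt{1-\lambda}\, Y\right) \;\geq\; \lambda\, h(X) + (1-\lambda)\, h(Y),
$$
where $h$ denotes differential entropy; the deficit referred to above is the nonnegative gap between the left- and right-hand sides.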
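To illustrate why such an embedding is convenient, here is a minimal sketch under the simplifying assumption that $X$ and $Y$ are realized as terminal values of Itô martingales driven by independent Brownian motions $B_t$ and $W_t$ (the actual construction in the paper uses a specific entropy-minimizing process): if $X = \int_0^1 \sigma_t \, dB_t$ and $Y = \int_0^1 \tau_t \, dW_t$ for matrix-valued adapted processes $\sigma_t, \tau_t$, then $X + Y$ is again the terminal value of a martingale, and by independence of the driving motions its quadratic co-variation matrix is simply the sum
$$
[X + Y]_1 = \int_0^1 \left( \sigma_t \sigma_t^{\top} + \tau_t \tau_t^{\top} \right) dt = [X]_1 + [Y]_1,
$$
with no cross terms.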