Information-theory based variational principles have proven effective at providing scalable uncertainty quantification (i.e. robustness) bounds for quantities of interest in the presence of non-parametric model-form uncertainty. In this work, we combine such variational formulas with functional inequalities (Poincaré, log-Sobolev, Liapunov functions) to derive explicit uncertainty quantification bounds applicable to both discrete and continuous-time Markov processes. These bounds are well-behaved in the infinite-time limit and apply to steady-states.

Keywords uncertainty quantification · Markov process · relative entropy · Poincaré inequality · log-Sobolev inequality · Liapunov function · Bernstein inequality

Mathematics Subject Classification (2010) 47D07 · 39B72 · 60F10 · 60J25

1 Introduction

Information-theory based variational principles have proven effective at providing uncertainty quantification (i.e. robustness) bounds for quantities of interest in the presence of non-parametric model-form uncertainty [1,2,3,4,5,6,7,8,9,10]. In the present work, we combine these tools with functional