The conditional mutual information $I(X;Y|Z)$ measures the average information that $X$ and $Y$ contain about each other given $Z$. This is an important primitive in many learning problems, including conditional independence testing, graphical model inference, causal strength estimation, and time-series problems. In several applications, it is desirable to have a functional purely of the conditional distribution $p_{Y|X,Z}$ rather than of the joint distribution $p_{X,Y,Z}$. We define the potential conditional mutual information as the conditional mutual information calculated with a modified joint distribution $p_{Y|X,Z}\, q_{X,Z}$, where $q_{X,Z}$ is a potential distribution, fixed a priori. We develop $k$-nearest-neighbor based estimators for this functional, employing importance sampling and a coupling trick, and prove the finite-$k$ consistency of such an estimator. We demonstrate that the estimator has excellent practical performance and show an application in dynamical system inference.
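To make the definition concrete, one way to write the potential conditional mutual information under the modified joint $\tilde{p}_{X,Y,Z} = p_{Y|X,Z}\, q_{X,Z}$ is the following (the symbols $\tilde{I}_q$ and $\tilde{p}$ are illustrative notation, not necessarily the paper's):
\[
\tilde{I}_q(X;Y \mid Z) \;=\; \mathbb{E}_{\tilde{p}}\!\left[\log \frac{p_{Y\mid X,Z}(Y \mid X,Z)}{\tilde{p}_{Y\mid Z}(Y \mid Z)}\right],
\qquad
\tilde{p}_{Y\mid Z}(y \mid z) \;=\; \int p_{Y\mid X,Z}(y \mid x,z)\, q_{X\mid Z}(x \mid z)\,\mathrm{d}x,
\]
where $q_{X|Z}$ is the conditional distribution induced by $q_{X,Z}$. Only the conditional $p_{Y|X,Z}$ of the original data-generating distribution enters the functional.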
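For orientation, below is a minimal sketch of the classical KSG-style $k$-nearest-neighbor conditional mutual information estimator that such estimators build on; it does not reproduce the paper's importance-sampling re-weighting or coupling trick, and the function name knn_cmi is ours.

import numpy as np
from scipy.special import digamma
from scipy.spatial import cKDTree

def knn_cmi(x, y, z, k=5):
    """KSG-style k-NN estimate of I(X;Y|Z); x, y, z are (n_samples, dim) arrays."""
    xyz = np.hstack([x, y, z])
    xz = np.hstack([x, z])
    yz = np.hstack([y, z])

    # Distance to the k-th neighbor in the joint (X,Y,Z) space under the max-norm;
    # the query returns the point itself first, so ask for k+1 neighbors.
    d, _ = cKDTree(xyz).query(xyz, k=k + 1, p=np.inf)
    rho = d[:, -1]

    def count_within(pts, r):
        # Count neighbors strictly inside radius r_i around each point (excluding itself).
        tree = cKDTree(pts)
        return np.array([
            len(tree.query_ball_point(pt, r_i - 1e-12, p=np.inf)) - 1
            for pt, r_i in zip(pts, r)
        ])

    n_xz = count_within(xz, rho)
    n_yz = count_within(yz, rho)
    n_z = count_within(z, rho)

    # Frenzel-Pompe / KSG combination of digamma terms.
    return digamma(k) + np.mean(digamma(n_z + 1) - digamma(n_xz + 1) - digamma(n_yz + 1))

# Example: Y depends on X while Z is independent, so I(X;Y|Z) should be positive.
rng = np.random.default_rng(0)
x = rng.normal(size=(2000, 1))
y = x + 0.5 * rng.normal(size=(2000, 1))
z = rng.normal(size=(2000, 1))
print(knn_cmi(x, y, z, k=5))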