Reconstruction error bounds in compressed sensing under Gaussian or uniform bounded noise do not translate easily to the case of Poisson noise. Reasons for this include the signal-dependent nature of Poisson noise, and the fact that the negative log-likelihood of a Poisson distribution (which is directly related to the generalized Kullback-Leibler divergence) is not a metric and does not obey the triangle inequality. There exist prior theoretical results in the form of provable error bounds for computationally tractable estimators for compressed sensing problems under Poisson noise. However, these results do not apply to realistic compressive systems, which must obey crucial constraints such as non-negativity and flux preservation. On the other hand, the published literature contains provable error bounds for such realistic systems, but only for estimators that are computationally intractable. In this paper, we develop error bounds for a computationally tractable estimator that also applies to realistic compressive systems obeying the required constraints. The key idea of our technique is to replace the generalized Kullback-Leibler divergence with an information-theoretic metric, namely the square root of the Jensen-Shannon divergence, which is related to an approximate, symmetrized version of the Poisson log-likelihood function. We show that this replacement allows for very simple proofs of the error bounds; our analysis proposes and proves several interesting statistical properties of the square root of the Jensen-Shannon divergence and exploits other known ones. Numerical experiments are performed to show the practical use of the technique in signal and image reconstruction.
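For reference, one standard form of the two quantities contrasted above is recalled below; the symbols $G$, $D$, $\mathrm{JS}$, $x$, $y$, $p$, $q$ are introduced here only for illustration, and the exact normalization used in the paper may differ.

\[
G(x,y) \;=\; \sum_i \Big( x_i \log\frac{x_i}{y_i} - x_i + y_i \Big),
\qquad
\mathrm{JS}(p,q) \;=\; \tfrac{1}{2}\, D\!\Big(p \,\Big\|\, \tfrac{p+q}{2}\Big) \;+\; \tfrac{1}{2}\, D\!\Big(q \,\Big\|\, \tfrac{p+q}{2}\Big),
\]

where $G$ denotes the generalized Kullback-Leibler divergence between non-negative vectors, $D$ the ordinary Kullback-Leibler divergence, and $\mathrm{JS}$ the Jensen-Shannon divergence. It is a known result that $\sqrt{\mathrm{JS}(p,q)}$ satisfies the triangle inequality and is therefore a metric, whereas neither $G$ nor $D$ is.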