Abstract
We discuss an alternative to relative entropy as a measure of distance between mixed quantum states. The proposed quantity extends to the realm of quantum theory the Jensen-Shannon divergence (JSD) between probability distributions. The JSD has several interesting properties: it arises in information theory and, unlike the Kullback-Leibler divergence, it is symmetric, always well defined, and bounded. We show that the quantum JSD shares with the relative entropy most of the physically relevant properties, in particular those required for a “good” quantum distinguishability measure. We relate it to other known quantum distances and suggest possible applications in the field of quantum information theory.
Received 19 August 2005
DOI:https://doi.org/10.1103/PhysRevA.72.052310
©2005 American Physical Society
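The quantum JSD discussed in the abstract is commonly defined via the von Neumann entropy as JSD(ρ,σ) = S((ρ+σ)/2) − [S(ρ)+S(σ)]/2. The following is a minimal numerical sketch of that standard definition (using base-2 logarithms, so the bound is 1 bit); it is an illustration of the general formula, not the paper's own code.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop (near-)zero eigenvalues: 0 log 0 = 0
    return float(-np.sum(evals * np.log2(evals)))

def quantum_jsd(rho, sigma):
    """Quantum Jensen-Shannon divergence:
    JSD(rho, sigma) = S((rho+sigma)/2) - [S(rho) + S(sigma)]/2.
    Symmetric in its arguments and bounded by 1 (with log base 2)."""
    return von_neumann_entropy((rho + sigma) / 2) - 0.5 * (
        von_neumann_entropy(rho) + von_neumann_entropy(sigma))

# Two orthogonal pure qubit states: the JSD attains its maximum, 1 bit.
rho = np.array([[1.0, 0.0], [0.0, 0.0]])
sigma = np.array([[0.0, 0.0], [0.0, 1.0]])
print(quantum_jsd(rho, sigma))  # → 1.0
```

For identical states the divergence vanishes, and swapping the arguments leaves the value unchanged, illustrating the symmetry and boundedness properties highlighted in the abstract.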