Abstract
The sensitivity afforded by quantum sensors is limited by decoherence. Quantum error correction (QEC) can enhance sensitivity by suppressing decoherence, but it has a side effect: it biases a sensor’s output in realistic settings. If unaccounted for, this bias can systematically reduce a sensor’s performance in experiment, and also give misleading values for the minimum detectable signal in theory. We analyze this effect in the experimentally motivated setting of continuous-time QEC, showing both how one can remedy it, and how incorrect results can arise when one does not.
- Received 22 January 2021
- Revised 21 December 2021
- Accepted 28 January 2022
DOI: https://doi.org/10.1103/PhysRevLett.128.140503
© 2022 American Physical Society