Abstract
Proposed near-future upgrades of the current advanced interferometric gravitational wave detectors include the use of frequency-dependent squeezed light to reduce the currently sensitivity-limiting quantum noise. We quantify and describe the degradation effects that spatial mode-mismatches between optical resonators have on the squeezed field. These mode-mismatches can, to first order, be described by scattering of light into second-order Gaussian modes. As a demonstration of principle, we also show that squeezing the second-order Hermite-Gaussian modes, in addition to the fundamental mode, has the potential to increase the robustness to spatial mode-mismatches. This scheme, however, requires independently optimised squeeze angles for each squeezed spatial mode, which would be challenging to realise in practice.
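The first-order scattering statement can be illustrated numerically. As a sketch with illustrative parameters (not taken from the paper): a fundamental Gaussian beam whose waist size is mismatched by a small fraction ε overlaps, to first order in ε, with the second-order Hermite-Gaussian mode with amplitude ε/√2, which is the standard first-order expansion of a waist-size mismatch in the Hermite-Gaussian basis.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite import hermval

def hg_mode(n, x, w):
    """Normalized 1D Hermite-Gaussian mode u_n(x) at the beam waist w."""
    coef = np.zeros(n + 1)
    coef[n] = 1.0  # select H_n in the (physicists') Hermite expansion
    norm = (2 / np.pi) ** 0.25 / np.sqrt(2**n * factorial(n) * w)
    return norm * hermval(np.sqrt(2) * x / w, coef) * np.exp(-(x / w) ** 2)

# Illustrative values: nominal waist w and a 1% waist-size mismatch eps.
w, eps = 1.0, 0.01
x = np.linspace(-10, 10, 200001)
dx = x[1] - x[0]

# Overlap of the waist-mismatched fundamental mode with the nominal
# second-order mode; to first order this equals eps / sqrt(2).
u0_mismatched = hg_mode(0, x, w * (1 + eps))
c2 = np.sum(hg_mode(2, x, w) * u0_mismatched) * dx

print(c2, eps / np.sqrt(2))  # the two agree to first order in eps
```

The first-order coefficient ε/√2 is recovered to within O(ε²) corrections, consistent with the statement that small mode-mismatches scatter light predominantly into the second-order modes.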
Received 27 April 2017
DOI: https://doi.org/10.1103/PhysRevD.96.022006
© 2017 American Physical Society