Optimal calibration accuracy for gravitational-wave detectors

Lee Lindblom
Phys. Rev. D 80, 042005 – Published 18 August 2009

Abstract

Calibration errors in the response function of a gravitational-wave detector degrade its ability to detect signals and then to measure the properties of any signals it does detect. This paper derives the needed levels of calibration accuracy for each of these data-analysis tasks. The levels derived here are optimal in the sense that lower accuracy would result in missed detections and/or a loss of measurement precision, while higher accuracy would be made irrelevant by the intrinsic noise level of the detector. Calibration errors affect the data-analysis process in much the same way as errors in theoretical waveform templates. The optimal level of calibration accuracy is therefore expressed as a joint limit on modeling and calibration errors: increased accuracy in one reduces the accuracy requirement on the other.
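
The full derivation is behind the paywall, but the general shape of such accuracy standards is documented in the closely related paper by Lindblom, Owen, and Brown, Phys. Rev. D 78, 124020 (2008), on which this analysis builds. As a sketch only (the exact coefficients and the precise joint modeling/calibration limit are in the paywalled text), requirements of this kind are typically written in terms of the noise-weighted inner product and its norm:

    % Noise-weighted inner product; S_n(f) is the detector noise power
    % spectral density, rho = ||h|| is the optimal signal-to-noise ratio.
    \langle h_1, h_2 \rangle = 4\,\mathrm{Re} \int_0^\infty
        \frac{\tilde h_1(f)\, \tilde h_2^*(f)}{S_n(f)}\, df ,
    \qquad \|h\| = \sqrt{\langle h, h \rangle} = \rho .

    % Measurement (schematic): waveform errors must stay below the
    % intrinsic statistical error, which scales as 1/rho:
    \frac{\|\delta h\|}{\|h\|} \lesssim \frac{1}{\rho} .

    % Detection (schematic): errors must not reduce the matched-filter
    % SNR by more than a chosen tolerance \epsilon_{\max}:
    \frac{\|\delta h\|}{\|h\|} \lesssim \sqrt{2\,\epsilon_{\max}} .

Because a calibration error delta R in the response function R(f) enters the analysis as an effective waveform error, delta h_cal = (delta R / R) h, the modeling and calibration contributions add at worst by the triangle inequality, ||delta h_model|| + ||delta h_cal|| <= bound, which is the sense in which improved accuracy in one relaxes the requirement on the other.

A minimal numerical illustration of this bookkeeping, using a toy waveform, a toy noise curve, and a hypothetical 2% amplitude / 30 mrad phase calibration error (none of these numbers come from the paper):

    import numpy as np

    def inner(h1, h2, psd, df):
        """Noise-weighted inner product: 4 Re sum(h1 * conj(h2) / Sn) * df."""
        return 4.0 * np.real(np.sum(h1 * np.conj(h2) / psd)) * df

    f = np.linspace(20.0, 1024.0, 4096)                          # frequency grid (Hz)
    df = f[1] - f[0]
    psd = 1e-46 * (1.0 + (150.0 / f) ** 4 + (f / 150.0) ** 2)    # toy noise curve
    h = np.exp(2j * np.pi * f * 0.01) / f ** (7.0 / 6.0)         # toy inspiral-like signal
    dR_over_R = 0.02 * np.exp(0.03j)                             # hypothetical calibration error

    dh = dR_over_R * h                            # calibration-induced waveform error
    rho = np.sqrt(inner(h, h, psd, df))           # optimal SNR, ||h||
    frac = np.sqrt(inner(dh, dh, psd, df)) / rho  # ||delta h|| / ||h||
    print(f"||dh||/||h|| = {frac:.4f}; measurement bound 1/rho = {1.0 / rho:.4e}")

For a frequency-independent delta R/R, the ratio ||dh||/||h|| reduces to |delta R / R| (0.02 here) regardless of the noise curve; a realistic calibration error varies across the band, which is why the noise-weighted norm matters.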

  • Received 28 June 2009

DOI: https://doi.org/10.1103/PhysRevD.80.042005

©2009 American Physical Society

Authors & Affiliations

Lee Lindblom

  • Theoretical Astrophysics 350-17, California Institute of Technology, Pasadena, California 91125, USA

Article Text (Subscription Required)

References (Subscription Required)
Issue

Vol. 80, Iss. 4 — 15 August 2009
