Abstract
Using supersymmetry calculations and random matrix simulations, we study the decay of the average of the fidelity amplitude $f_\varepsilon(\tau)=\langle \operatorname{tr}\{\exp(2\pi i H_\varepsilon \tau)\,\exp(-2\pi i H_0 \tau)\}\rangle$, where $H_\varepsilon$ differs from $H_0$ by a slight perturbation characterized by the parameter $\varepsilon$. For strong perturbations a recovery of $f_\varepsilon(\tau)$ at the Heisenberg time $\tau = 1$ is found. It is most pronounced for the Gaussian symplectic ensemble, and least for the Gaussian orthogonal one. Using Dyson’s Brownian-motion model for an eigenvalue crystal, the recovery is interpreted in terms of a spectral analogue of the Debye-Waller factor known from solid state physics, which describes the decrease of X-ray and neutron diffraction peaks with temperature due to lattice vibrations.
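The random-matrix simulation mentioned above can be sketched in a few lines. The following is a minimal illustration, not the authors' actual code: it draws $H_0$ and a perturbation $V$ from the Gaussian Orthogonal Ensemble, sets $H_\varepsilon = H_0 + \varepsilon V$, and Monte-Carlo averages the fidelity amplitude over realizations. The matrix size, perturbation strength, and normalization are arbitrary choices here; in particular, the Heisenberg time sits at $\tau = 1$ only after proper unfolding of the spectrum, which this sketch omits.

```python
import numpy as np
from scipy.linalg import expm

def goe_matrix(n, rng):
    """Draw an n x n real symmetric matrix from the GOE (up to normalization)."""
    a = rng.standard_normal((n, n))
    return (a + a.T) / 2.0

def fidelity_amplitude(n=50, eps=0.05, times=np.linspace(0.0, 2.0, 41),
                       realizations=20, seed=0):
    """Monte-Carlo estimate of the averaged fidelity amplitude
    f_eps(t) = < (1/n) tr exp(2*pi*i*H_eps*t) exp(-2*pi*i*H_0*t) >,
    averaged over `realizations` independent GOE draws."""
    rng = np.random.default_rng(seed)
    f = np.zeros(len(times), dtype=complex)
    for _ in range(realizations):
        h0 = goe_matrix(n, rng)          # unperturbed Hamiltonian H_0
        v = goe_matrix(n, rng)           # perturbation direction V
        heps = h0 + eps * v              # perturbed Hamiltonian H_eps
        for k, t in enumerate(times):
            # forward evolution under H_eps, backward under H_0
            u = expm(2j * np.pi * heps * t) @ expm(-2j * np.pi * h0 * t)
            f[k] += np.trace(u) / n
    return f / realizations
```

Since both exponentials are unitary, $|f_\varepsilon(\tau)| \le 1$ with equality at $\tau = 0$; the decay (and, for strong perturbations, any recovery) shows up in the modulus of the returned array.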
Received 27 July 2004
DOI:https://doi.org/10.1103/PhysRevLett.94.244101
©2005 American Physical Society