Abstract
We study entropy generation in a one-dimensional (1D) model of bosons in an optical lattice subject to two-particle losses. Such heating is a major impediment to observing exotic low-temperature states and to "simulating" condensed-matter systems. Developing intuition through numerical simulations, we present a simple empirical model for the entropy produced in this 1D setting. We also explore the time evolution of one- and two-particle correlation functions, showing that they are robust against two-particle loss. Because of this robustness, induced two-body losses can be used as a probe of short-range magnetic correlations.
- Received 26 March 2010
DOI: https://doi.org/10.1103/PhysRevA.82.023626
©2010 American Physical Society