Abstract
Models of strongly correlated electrons that tend to phase separate are studied with a long-range 1/r repulsive interaction included. It is observed that charge-density-wave states become stable as the strength of the 1/r term is increased. Due to this effect, the domain of stability of the superconducting phases that appear near phase separation in the absence of the 1/r interaction is not enlarged by it, as one might naively expect. Nevertheless, superconductivity persists in a finite region of parameter space even when phase separation is suppressed. Our results have implications for some theories of the cuprates.
Received 21 September 1994
DOI: https://doi.org/10.1103/PhysRevB.51.5989
©1995 American Physical Society