Abstract
The effect of electron correlations on the impurity conductance of the shallow-donor impurity band in a semiconductor quantum wire, connected to two ideal leads, is studied by using the Hubbard model in an alloy-analogy approximation. The hopping integral and the intrasite Coulomb interaction energy are estimated numerically from variational wave functions for random impurity configurations. For one electron per impurity, it is shown that there is a considerable reduction in the impurity conductance due to electron correlations. For a given impurity concentration, the disordered wire turns into an insulator at a much shorter sample length than that estimated previously by neglecting correlations.
- Received 7 December 1992
DOI: https://doi.org/10.1103/PhysRevB.47.10920
©1993 American Physical Society
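The physical picture in the abstract can be caricatured numerically. The sketch below is not the paper's calculation: it replaces the full alloy-analogy treatment by its simplest stand-in, a one-dimensional tight-binding chain in which each site energy is shifted by the intrasite Coulomb energy U with probability one half (one electron per impurity at half filling), and computes the Landauer transmission at the band center through ideal semi-infinite leads. All function names (`transmission`, `avg_T`) and parameter values (t = 1, U = 2t, N = 40 sites) are illustrative assumptions, chosen only to show that switching on U suppresses the averaged transmission relative to the uncorrelated chain.

```python
import numpy as np

rng = np.random.default_rng(0)

def transmission(eps, t=1.0, E=0.0):
    """Landauer transmission of a 1D tight-binding chain with on-site
    energies eps, coupled at both ends to ideal semi-infinite leads
    with the same hopping t (valid for |E| < 2t)."""
    N = len(eps)
    H = (np.diag(eps)
         - t * np.diag(np.ones(N - 1), 1)
         - t * np.diag(np.ones(N - 1), -1))
    # surface Green's function of an ideal 1D lead at energy E
    g = (E - 1j * np.sqrt(4 * t**2 - E**2)) / (2 * t**2)
    sigma = t**2 * g                      # lead self-energy
    M = (E + 1e-12j) * np.eye(N) - H
    M[0, 0] -= sigma
    M[-1, -1] -= sigma
    G = np.linalg.inv(M)                  # retarded Green's function
    gamma = -2 * np.imag(sigma)           # level broadening from each lead
    return gamma**2 * abs(G[0, -1])**2    # T = Gamma_L Gamma_R |G_1N|^2

def avg_T(N, U, samples=200, t=1.0):
    """Disorder-averaged transmission: each site energy is 0 or U with
    equal probability (crude alloy-analogy at half filling)."""
    return float(np.mean([transmission(rng.choice([0.0, U], size=N), t)
                          for _ in range(samples)]))

T_free = avg_T(40, 0.0)   # U = 0: perfect chain, transmission 1
T_corr = avg_T(40, 2.0)   # U = 2t: correlation-split band, reduced transmission
```

With these assumed parameters the uncorrelated chain transmits perfectly, while the random 0/U site energies scatter the carriers and the averaged transmission drops well below one, mimicking the correlation-induced reduction of the impurity conductance described in the abstract.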