Abstract
The convergence of the linear δ expansion is studied in the context of the integral I := ∫₋∞^∞ dx e^(−gx⁴), which corresponds to a massless gφ⁴ theory in zero dimensions. The method consists of rewriting the exponent as −δgx⁴ − λ(1−δ)x² and expanding in powers of δ. The arbitrary parameter λ is fixed by the principle of minimal sensitivity, ∂I_K(λ)/∂λ = 0, where I_K is the expansion truncated at order K with δ set equal to 1. This has a solution λ_K only for K odd, when it gives very good numerical results. We are able to show analytically, using saddle-point methods, that the sequence of approximants I_K(λ_K) is convergent, the error decreasing exponentially with K, even though for fixed λ the series expansion is a divergent alternating series.
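As a numerical illustration (not part of the paper), the construction described in the abstract can be sketched in Python. The coupling g = 1, the truncation orders shown, and the bisection bracket are illustrative choices; the truncated series is built from the Gaussian moments ∫ x^(2m) e^(−λx²) dx = Γ(m+1/2)/λ^(m+1/2).

```python
# Illustrative sketch (not from the paper) of the linear delta expansion
# for I = ∫ exp(-g x^4) dx, with the PMS condition dI_K/dλ = 0.
import math

g = 1.0  # illustrative coupling

def I_K(lam, K):
    """Order-K truncation of the δ expansion, evaluated at δ = 1:
    I_K(λ) = Σ_{n=0}^{K} (-1)^n/n! ∫ e^{-λx²} (g x⁴ - λ x²)^n dx."""
    total = 0.0
    for n in range(K + 1):
        inner = 0.0
        for k in range(n + 1):  # binomial expansion of (g x⁴ - λ x²)^n
            inner += (math.comb(n, k) * g**k * (-lam)**(n - k)
                      * math.gamma(n + k + 0.5) / lam**(n + k + 0.5))
        total += (-1)**n / math.factorial(n) * inner
    return total

def pms_lambda(K, lo=0.1, hi=50.0):
    """Principle of minimal sensitivity: bisect dI_K/dλ = 0 on [lo, hi].
    For K odd the derivative is positive at small λ and negative at large λ."""
    h = 1e-6
    d = lambda lam: (I_K(lam + h, K) - I_K(lam - h, K)) / (2 * h)
    a, b = lo, hi
    fa = d(a)
    for _ in range(200):
        m = 0.5 * (a + b)
        fm = d(m)
        if fa * fm <= 0:
            b = m
        else:
            a, fa = m, fm
    return 0.5 * (a + b)

exact = math.gamma(0.25) / (2 * g**0.25)  # closed form of ∫ exp(-g x⁴) dx
for K in (1, 3, 5, 7):  # the PMS condition has a solution only for K odd
    lam = pms_lambda(K)
    print(f"K={K}: lambda_K={lam:.4f}, I_K={I_K(lam, K):.6f}, exact={exact:.6f}")
```

For K = 1 this gives λ₁ = √(5g/2) ≈ 1.5811 and I₁(λ₁) ≈ 1.6915, against the exact value Γ(1/4)/2 ≈ 1.8128; the K = 3 approximant is already within roughly 1% of the exact value, consistent with the rapid convergence claimed in the abstract.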
Received 21 July 1992
DOI: https://doi.org/10.1103/PhysRevD.47.2554
©1993 American Physical Society