Abstract
We argue that the current heterogeneous computing environment resembles a complex nonlinear system and can benefit from the concepts of time-scale separation and delayed differences borrowed from statistical mechanics and nonlinear dynamics. We show that by replacing the usual difference-equation approach with a delayed-difference-equation approach, the sequential fraction of many scientific computing algorithms can be substantially reduced. We also provide a comprehensive theoretical analysis establishing that the error and stability of our scheme are of the same order as those of existing schemes for a large, well-characterized class of problems.
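To illustrate the kind of scheme the abstract refers to, here is a minimal sketch (not the authors' actual algorithm): a 1D explicit diffusion update in which the domain is split into two subdomains whose boundary ("halo") values are exchanged only every `tau` steps, so each subdomain can advance several steps without synchronizing. The function names `synchronous` and `delayed` and the parameter `tau` are our own illustrative choices.

```python
import numpy as np

def synchronous(u0, alpha, steps):
    """Standard explicit finite-difference scheme: every step uses
    the current values of all neighbours (fixed end points)."""
    u = u0.copy()
    for _ in range(steps):
        un = u.copy()
        un[1:-1] = u[1:-1] + alpha * (u[2:] - 2.0 * u[1:-1] + u[:-2])
        u = un
    return u

def delayed(u0, alpha, steps, tau):
    """Delayed-difference sketch: the domain is cut into two halves,
    and each half sees the other's boundary value refreshed only
    every `tau` steps, mimicking less frequent synchronization."""
    mid = len(u0) // 2
    left, right = u0[:mid].copy(), u0[mid:].copy()
    halo_l = right[0]    # stale copy of right half's first point
    halo_r = left[-1]    # stale copy of left half's last point
    for step in range(steps):
        if step % tau == 0:          # synchronize every tau steps
            halo_l, halo_r = right[0], left[-1]
        ln, rn = left.copy(), right.copy()
        # interior points use up-to-date local neighbours
        ln[1:-1] = left[1:-1] + alpha * (left[2:] - 2.0 * left[1:-1] + left[:-2])
        rn[1:-1] = right[1:-1] + alpha * (right[2:] - 2.0 * right[1:-1] + right[:-2])
        # subdomain-edge points use the (possibly stale) halo value
        ln[-1] = left[-1] + alpha * (halo_l - 2.0 * left[-1] + left[-2])
        rn[0] = right[0] + alpha * (right[1] - 2.0 * right[0] + halo_r)
        left, right = ln, rn
    return np.concatenate([left, right])
```

With `tau = 1` the delayed scheme reduces to the synchronous one; for small `tau > 1` it trades a small, bounded error for far fewer synchronization points, which is the reduction in sequential fraction the abstract describes.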
Received 9 April 2014
DOI: https://doi.org/10.1103/PhysRevLett.113.218701
© 2014 American Physical Society