Abstract
Superparamagnetic tunnel junctions (SMTJs) have emerged as a competitive, realistic nanotechnology to support novel forms of stochastic computation in CMOS-compatible platforms. One of their applications is to generate random bitstreams suitable for use in stochastic computing implementations. We describe a method for digitally programmable bitstream generation based on precharge sense amplifiers. This generator is significantly more energy efficient than SMTJ-based bitstream generators that tune probabilities with spin currents, and a factor of 2 more efficient than related CMOS-based implementations. The true randomness of these bitstream generators allows us to use them as the fundamental units of a novel neural network architecture. To take advantage of the potential savings, we codesign the algorithm with the circuit, rather than directly transcribing a classical neural network into hardware. The flexibility of the neural network mathematics allows us to adapt the network to the explicitly energy-efficient choices we make at the device level. The result is a convolutional neural network design operating at nanojoule-scale energy per inference with performance on the MNIST data set, a factor of 1.4 to 7.7 improvement in energy efficiency over comparable proposals in the recent literature.
- Received 25 November 2019
- Revised 21 January 2020
- Accepted 18 February 2020
DOI: https://doi.org/10.1103/PhysRevApplied.13.034016
© 2020 American Physical Society