
Proceedings Paper

Semi-supervised learning and inference in domain-wall magnetic tunnel junction (DW-MTJ) neural networks
Author(s): Christopher H. Bennett; Naimul Hassan; Xuan Hu; Jean Anne C. Incorvia; Joseph S. Friedman; Matthew J. Marinella

Paper Abstract

Advances in machine intelligence have sparked interest in hardware accelerators to implement these algorithms, yet embedded electronics have stringent power, area, and speed budgets that may limit non-volatile memory (NVM) integration. In this context, the development of fast nanomagnetic neural networks that require minimal training data is attractive. Here, we extend an inference-only proposal that uses the intrinsic physics of domain-wall MTJ (DW-MTJ) neurons for online learning to implement fully unsupervised pattern recognition, using winner-take-all networks that contain either random or plastic synapses (weights). Meanwhile, a read-out layer trains in a supervised fashion. We find that our proposed design approaches state-of-the-art success on the task relative to competing memristive neural network proposals, while eliminating much of the area and energy overhead that would typically be required to build the neuronal layers with CMOS devices.
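The architecture sketched in the abstract — a winner-take-all (WTA) hidden layer with random or plastic synapses, followed by a read-out layer trained in supervised fashion — can be illustrated at the algorithmic level with a toy model. All dimensions, learning rules, and rates below are illustrative assumptions for a software sketch, not the paper's device-level DW-MTJ model:

```python
import numpy as np

class WTANetwork:
    """Toy sketch: a winner-take-all hidden layer (random or plastic
    synapses) feeding a supervised, perceptron-style read-out layer.
    Hypothetical dimensions and learning rates throughout."""

    def __init__(self, n_in=16, n_wta=8, n_out=2, plastic=True, seed=0):
        rng = np.random.default_rng(seed)
        self.W_wta = rng.random((n_wta, n_in))  # random initial synapses
        self.W_out = np.zeros((n_out, n_wta))   # read-out weights
        self.plastic = plastic

    def wta(self, x):
        # Lateral inhibition: only the most strongly driven neuron fires.
        winner = np.argmax(self.W_wta @ x)
        y = np.zeros(self.W_wta.shape[0])
        y[winner] = 1.0
        return y

    def unsupervised_step(self, x, lr=0.05):
        # Hebbian-style plasticity: the winner's weights drift toward x.
        y = self.wta(x)
        if self.plastic:
            w = int(np.argmax(y))
            self.W_wta[w] += lr * (x - self.W_wta[w])
        return y

    def supervised_readout_step(self, y, label, lr=0.1):
        # Perceptron rule applied to the read-out layer only.
        pred = int(np.argmax(self.W_out @ y))
        if pred != label:
            self.W_out[label] += lr * y
            self.W_out[pred] -= lr * y

    def predict(self, x):
        return int(np.argmax(self.W_out @ self.wta(x)))
```

With `plastic=False`, the hidden synapses stay at their random initial values, matching the "random synapses" variant; with `plastic=True`, the winner's weights adapt online to repeated inputs while the read-out alone receives labels.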

Paper Details

Date Published: 16 September 2019
PDF: 7 pages
Proc. SPIE 11090, Spintronics XII, 110903I (16 September 2019); doi: 10.1117/12.2530308
Author Affiliations:
Christopher H. Bennett, Sandia National Labs. (United States)
Naimul Hassan, The Univ. of Texas at Dallas (United States)
Xuan Hu, The Univ. of Texas at Dallas (United States)
Jean Anne C. Incorvia, The Univ. of Texas at Austin (United States)
Joseph S. Friedman, The Univ. of Texas at Dallas (United States)
Matthew J. Marinella, Sandia National Labs. (United States)


Published in SPIE Proceedings Vol. 11090:
Spintronics XII
Henri-Jean M. Drouhin; Jean-Eric Wegrowe; Manijeh Razeghi, Editor(s)

© SPIE.