A low-power RRAM memory block for embedded, multi-level weight and bias storage in artificial neural networks

Pechmann S, Mai T, Potschka J, Reiser D, Reichel P, Breiling M, Reichenbach M, Hagelauer A (2021)


Publication Type: Journal article

Publication year: 2021

Journal: Micromachines

Journal Volume: 12

Article Number: 1277

Journal Issue: 11

DOI: 10.3390/mi12111277

Abstract

Pattern recognition as a computing task is very well suited to machine learning algorithms utilizing artificial neural networks (ANNs). Computing systems using ANNs usually require some form of data storage to hold the weight and bias values for the processing elements of the individual neurons. This paper introduces a memory block using resistive random-access memory (RRAM) cells to realize this weight and bias storage in an embedded and distributed way, while also offering programmability and multi-level storage capability. By implementing power gating, overall power consumption is decreased significantly without data loss, taking advantage of the non-volatility of the RRAM technology. Due to the versatility of the peripheral circuitry, the presented memory concept can be adapted to different applications and RRAM technologies.
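
As a rough, hypothetical illustration of the multi-level storage idea described in the abstract (not code from the paper, and making no assumptions about the authors' circuit), the following Python sketch quantizes floating-point weights to a small number of equally spaced levels, the way a multi-level RRAM cell with n levels can hold log2(n) bits per device. The function name, value range, and level count are all illustrative assumptions.

```python
import numpy as np

def quantize_to_levels(weights, n_levels=4, w_min=-1.0, w_max=1.0):
    """Hypothetical mapping of float weights to discrete RRAM levels.

    Each weight is clipped to [w_min, w_max] and rounded to the nearest
    of n_levels equally spaced levels, mimicking a multi-level cell that
    stores log2(n_levels) bits per device.
    """
    w = np.clip(weights, w_min, w_max)
    step = (w_max - w_min) / (n_levels - 1)
    codes = np.round((w - w_min) / step).astype(int)  # level index 0..n_levels-1
    return codes, w_min + codes * step                # (stored code, reconstructed weight)

# Example: quantize a small weight vector to 4 levels (2 bits per cell)
codes, w_q = quantize_to_levels(np.array([-0.8, 0.1, 0.55, 1.3]), n_levels=4)
print(codes)  # [0 2 2 3]
print(w_q)    # [-1.    0.33  0.33  1.  ]
```

In such a scheme, only the integer level codes would be written to the RRAM array; the reconstruction step stands in for reading back the programmed conductance during inference.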

How to cite

APA:

Pechmann, S., Mai, T., Potschka, J., Reiser, D., Reichel, P., Breiling, M., ... Hagelauer, A. (2021). A low-power RRAM memory block for embedded, multi-level weight and bias storage in artificial neural networks. Micromachines, 12(11), 1277. https://dx.doi.org/10.3390/mi12111277

MLA:

Pechmann, Stefan, et al. "A Low-Power RRAM Memory Block for Embedded, Multi-Level Weight and Bias Storage in Artificial Neural Networks." Micromachines 12.11 (2021): 1277.
