Hardware Implementation of Hyperbolic Tangent Activation Function for Floating Point Formats

Arvind TKR, Brand M, Heidorn C, Boppu S, Hannig F, Teich J (2020)


Publication Language: English

Publication Type: Conference contribution

Publication year: 2020

Publisher: IEEE

Conference Proceedings Title: Proceedings of the 24th International Symposium on VLSI Design and Test (VDAT)

Event location: Bhubaneswar, IN

ISBN: 978-1-7281-9369-4

DOI: 10.1109/VDAT50263.2020.9190305

Abstract

In this paper, we present an efficient hardware implementation of the hyperbolic tangent activation function, which is one of the most widely used activation functions in artificial neural networks for accelerating machine learning applications. The proposed design is the first to consider the floating-point representation of numbers, accounts for the nonlinear nature of the activation function while sampling, and uses a lookup table for the implementation. A unique division of the input range into bins that follows the binary pattern reduces the hardware implementation cost. Furthermore, the input data itself is used as the address for the lookup table; thus, no extra cost is incurred in hashing into the table, and only one memory access is required, resulting in a faster and more efficient hardware implementation. Our design proves to be 3x faster than similar hardware implementations in a 90 nm CMOS process.
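The binning scheme described in the abstract can be sketched in software: because IEEE-754 floating-point numbers already space values logarithmically by exponent, the upper bits of the encoding (exponent plus a few mantissa bits) partition the input range into binary-pattern bins and can serve directly as the lookup-table address. The sketch below is an illustration under assumed parameters, not the paper's actual design: the number of retained mantissa bits (`MANTISSA_BITS = 3`), the saturation threshold (|x| ≥ 8), and the small-input passthrough region are all assumptions chosen for the example.

```python
import math
import struct

MANTISSA_BITS = 3                        # assumed: top mantissa bits kept per bin

def float_to_index(x: float) -> int:
    """Use the upper bits of the IEEE-754 float32 encoding as the LUT address."""
    bits = struct.unpack('<I', struct.pack('<f', abs(x)))[0]
    return bits >> (23 - MANTISSA_BITS)  # drop the low mantissa bits

# Precompute the table: one entry per bin, sampled at the bin's lower edge.
# Exponents -8..3 cover roughly [2^-8, 8); tanh saturates beyond that.
LUT = {}
for e in range(-8, 4):
    for m in range(1 << MANTISSA_BITS):
        x = (1.0 + m / (1 << MANTISSA_BITS)) * 2.0 ** e
        LUT[float_to_index(x)] = math.tanh(x)

def tanh_lut(x: float) -> float:
    """Approximate tanh(x) with a single table access, no hashing."""
    if abs(x) >= 8.0:                    # assumed saturation region: tanh(x) ~ ±1
        return math.copysign(1.0, x)
    if abs(x) < 2.0 ** -8:               # assumed linear region: tanh(x) ~ x
        return x
    y = LUT[float_to_index(x)]           # one memory access; input bits are the address
    return math.copysign(y, x)           # tanh is odd, so the sign is handled separately
```

Because bin edges coincide with float values whose low mantissa bits are zero, an input at a bin edge (e.g., 0.5) hits its exact sampled `tanh` value, while other inputs in the bin share that entry; finer accuracy in hardware would come from keeping more mantissa bits, at the cost of a larger table.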

How to cite

APA:

Arvind, T.K.R., Brand, M., Heidorn, C., Boppu, S., Hannig, F., & Teich, J. (2020). Hardware Implementation of Hyperbolic Tangent Activation Function for Floating Point Formats. In Proceedings of the 24th International Symposium on VLSI Design and Test (VDAT). Bhubaneswar, IN: IEEE.

MLA:

Arvind, T. K. R., et al. "Hardware Implementation of Hyperbolic Tangent Activation Function for Floating Point Formats." Proceedings of the 24th International Symposium on VLSI Design and Test (VDAT), IEEE, 2020.
