Sabih M, Abdo M, Hannig F, Teich J (2025)
Publication Language: English
Publication Type: Journal article, Original article
Publication year: 2025
Journal Volume: 17
Pages Range: 329-332
Conference Proceedings Title: CASES: International Conference on Compilers, Architectures, and Synthesis for Embedded Systems
Journal Issue: 5
Binary Neural Networks (BNNs) are known for their minimal memory requirements, making them an attractive choice for resource-constrained environments. Sub-Bit Neural Networks (SBNNs) are a more recent advancement that extends the benefits of BNNs by compressing them even further, achieving sub-bit-level representations to maximize efficiency. However, effectively compressing and accelerating BNNs presents challenges. In this paper, we propose a novel approach to compress BNNs using a fixed-length compression scheme that can be efficiently decoded at runtime. We then propose RISC-V extensions, implemented as a Custom Functional Unit (CFU), that decode the compressed weights via a codebook stored in the FPGA's on-board memory and then perform XOR and population-count operations. This approach achieves a speedup of up to 2× compared to conventional BNNs deployed on a RISC-V softcore, with significantly less accuracy degradation, and provides a foundation for exploring even higher compression configurations to improve performance further.
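For illustration only, the following C sketch approximates the kind of computation the abstract describes: fixed-length weight codes are expanded through a codebook and combined with packed binary activations via XOR and popcount. The group size, code width, and function names are assumptions for this sketch, not the authors' CFU interface; in the paper, the decode, XOR, and popcount steps are issued through custom RISC-V instructions rather than plain C.

#include <stdint.h>

/* Illustrative parameters (assumptions, not taken from the paper):
 * a group of 32 binary weights is replaced by an 8-bit codebook index,
 * i.e., 0.25 bits per weight instead of 1 bit. */
#define GROUP_BITS   32                  /* binary weights per group        */
#define CODE_BITS     8                  /* fixed-length code per group     */
#define CODEBOOK_LEN (1 << CODE_BITS)

/* Codebook filled offline by the compression step; in the paper's design
 * it would reside in FPGA on-board memory and be read by the CFU. */
static uint32_t codebook[CODEBOOK_LEN];

/* Software reference of one compressed binary dot product over
 * sign-packed {-1,+1} activations and codebook-compressed weights. */
int32_t sbnn_dot(const uint8_t *codes,       /* fixed-length weight codes   */
                 const uint32_t *act_packed, /* sign-packed activations     */
                 int num_groups)
{
    int32_t mismatches = 0;
    for (int g = 0; g < num_groups; g++) {
        uint32_t w = codebook[codes[g]];     /* decode sub-bit weights      */
        uint32_t x = w ^ act_packed[g];      /* XOR: 1 where signs differ   */
        mismatches += __builtin_popcount(x); /* count differing positions   */
    }
    /* Bipolar dot product: matches - mismatches = total_bits - 2*mismatches */
    return (int32_t)(num_groups * GROUP_BITS) - 2 * mismatches;
}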
APA:
Sabih, M., Abdo, M., Hannig, F., & Teich, J. (2025). Beyond BNNs: Design and Acceleration of Sub-Bit Neural Networks using RISC-V Custom Functional Units. IEEE Embedded Systems Letters, 17(5), 329-332. https://doi.org/10.1109/LES.2025.3600565
MLA:
Sabih, Muhammad, et al. "Beyond BNNs: Design and Acceleration of Sub-Bit Neural Networks using RISC-V Custom Functional Units." IEEE Embedded Systems Letters 17.5 (2025): 329-332.