Sabih M, Abdo M, Hannig F, Teich J (2025)
Publication Type: Journal article
Publication year: 2025
Volume: 17
Pages: 329-332
Journal Issue: 5
Binary neural networks (BNNs) are known for their minimal memory requirements, making them an attractive choice for resource-constrained environments. Sub-bit neural networks (SBNNs) are a more recent advancement that extends the benefits of BNNs by compressing them even further, achieving sub-bit representations to maximize efficiency. However, effectively compressing and accelerating BNNs presents challenges. In this letter, we propose a novel approach to compress BNNs using a fixed-length compression scheme that can be efficiently decoded at runtime. We then propose RISC-V extensions, implemented as a custom function unit (CFU), that decode compressed weights via a codebook stored in FPGA on-board memory, followed by XOR and population-count operations. This approach achieves a speedup of up to 2× over conventional BNNs deployed on the RISC-V softcore, with significantly less accuracy degradation, and provides a foundation for exploring even higher compression configurations to improve performance further.
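The decode-then-compute path the abstract describes (codebook lookup, then XOR and population count) can be sketched in plain C as follows. The codebook size, entry width, and contents here are illustrative assumptions for exposition, not values from the letter; the paper realizes this datapath in hardware as a CFU rather than in software.

```c
#include <stdint.h>

/* Hypothetical 4-entry codebook: each 2-bit index selects a 32-bit
   binary weight word, so 32 binary weights are stored in 2 bits
   (a sub-bit, fixed-length encoding). Entries are made-up examples. */
static const uint32_t codebook[4] = {
    0x00000000u, 0xFFFFFFFFu, 0xAAAAAAAAu, 0x0F0F0F0Fu
};

/* Binarized dot product of one decoded weight word with a 32-bit
   activation word. With weights/activations in {-1, +1} packed as
   bits, agreeing bits contribute +1 and differing bits -1, so
   dot = 32 - 2 * popcount(w XOR a). */
int bnn_dot32(uint8_t code, uint32_t act)
{
    uint32_t w = codebook[code & 0x3u];     /* decode via codebook   */
    uint32_t x = w ^ act;                   /* XOR: 1 where differ   */
    return 32 - 2 * __builtin_popcount(x);  /* popcount -> signed dot */
}
```

In the letter's design, the codebook resides in FPGA on-board memory and the XOR/popcount step is performed by the custom function unit, so the softcore issues a single custom instruction instead of executing this loop body in software.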
APA:
Sabih, M., Abdo, M., Hannig, F., & Teich, J. (2025). Beyond BNNs: Design and Acceleration of Sub-Bit Neural Networks Using RISC-V Custom Functional Units. IEEE Embedded Systems Letters, 17(5), 329-332. https://doi.org/10.1109/LES.2025.3600565
MLA:
Sabih, Muhammad, et al. "Beyond BNNs: Design and Acceleration of Sub-Bit Neural Networks Using RISC-V Custom Functional Units." IEEE Embedded Systems Letters 17.5 (2025): 329-332.