Holzbock A, Hegde A, Dietmayer K, Belagiannis V (2023). Data-Free Backbone Fine-Tuning for Pruned Neural Networks.
Publication Type: Conference contribution
Publication year: 2023
Publisher: European Signal Processing Conference, EUSIPCO
Pages Range: 1255-1259
Conference Proceedings Title: European Signal Processing Conference
Event location: Helsinki, FIN
ISBN: 9789464593600
DOI: 10.23919/EUSIPCO58844.2023.10290102
Model compression techniques reduce the computational load and memory consumption of deep neural networks. After a compression operation such as parameter pruning, the model is normally fine-tuned on the original training dataset to recover from the performance drop caused by compression. However, the training data is not always available, e.g., due to privacy concerns. In this work, we present a data-free fine-tuning approach for pruning the backbone of deep neural networks. In particular, the pruned network backbone is trained with synthetically generated images and our proposed intermediate supervision, which mimics the unpruned backbone's output feature map. Afterwards, the pruned backbone can be combined with the original network head to make predictions. We generate the synthetic images by back-propagating gradients to noise images and rely on L1-pruning for the backbone pruning. In our experiments, we show that our approach is task-independent, since only the backbone is pruned. Evaluating our approach on 2D human pose estimation, object detection, and image classification, we demonstrate promising performance compared to the unpruned model. Our code is available at https://github.com/holzbock/dfbf.
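The abstract mentions L1-pruning as the backbone compression step. A minimal sketch of the underlying idea, scoring each convolutional filter by the L1 norm of its weights and keeping the highest-scoring fraction, is shown below; the function names, flat-list weight representation, and keep-ratio are illustrative assumptions, not the paper's implementation.

```python
# Illustrative sketch of L1-norm filter pruning (assumed detail, not the
# authors' code): each "filter" is a flat list of weights, and filters with
# the smallest L1 norm are considered least important and removed.

def l1_filter_scores(filters):
    """Score each filter by the L1 norm (sum of absolute weights)."""
    return [sum(abs(w) for w in f) for f in filters]

def prune_filters(filters, keep_ratio=0.5):
    """Return the indices of the surviving filters, in original order.

    Keeps the keep_ratio fraction of filters with the largest L1 norm.
    """
    scores = l1_filter_scores(filters)
    n_keep = max(1, int(len(filters) * keep_ratio))
    keep_idx = sorted(range(len(filters)),
                      key=lambda i: scores[i],
                      reverse=True)[:n_keep]
    return sorted(keep_idx)

# Example: four toy filters; the two with the largest L1 norms survive.
filters = [[0.1, -0.2], [1.0, 1.5], [0.0, 0.05], [-0.7, 0.6]]
print(prune_filters(filters, keep_ratio=0.5))  # → [1, 3]
```

In a real network the surviving indices would then be used to slice the corresponding weight tensors of that layer and of the following layer's input channels.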
APA:
Holzbock, A., Hegde, A., Dietmayer, K., & Belagiannis, V. (2023). Data-free backbone fine-tuning for pruned neural networks. In European Signal Processing Conference (pp. 1255-1259). Helsinki, FIN: European Signal Processing Conference, EUSIPCO.
MLA:
Holzbock, Adrian, et al. "Data-Free Backbone Fine-Tuning for Pruned Neural Networks." Proceedings of the 31st European Signal Processing Conference, EUSIPCO 2023, Helsinki, FIN, European Signal Processing Conference, EUSIPCO, 2023, pp. 1255-1259.