Schnell M, Gourmelon N, Christlein V, Seuret M (2025)
Publication Type: Conference Contribution
Publication year: 2025
The rapid development of modern neural networks has led to highly over-parameterized models, resulting in excessive memory usage, computation, and energy consumption at inference time. In this paper, we propose a structured pruning method inspired by the Lottery Ticket Hypothesis that aims to reduce network size while preserving accuracy. Our method removes entire neurons based on a magnitude-based selection criterion, unlike the conventional unstructured approach of setting individual weights to zero via binary masks. In doing so, we demonstrate the efficacy of magnitude-based layer and neuron selection techniques that guide our structured pruning algorithm without the need for complex search patterns. We validate our method on two distinct scenarios: the well-known CIFAR-100 dataset and a document image analysis task. We evaluate the benefits of our methodology using GPU-based energy measurements and show that our pruned networks can reduce the energy consumption per sample [Wh/sample] by more than 40%, with comparable or slightly superior test accuracies. These findings highlight the potential of structured pruning to create energy-efficient neural networks suitable for deployment in resource-constrained environments.
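The core idea the abstract contrasts, removing whole neurons (and thus shrinking the weight matrices) rather than masking individual weights to zero, can be sketched as follows. This is a minimal NumPy illustration of magnitude-based structured pruning for one fully connected hidden layer, not the authors' implementation; the function name, the L1 scoring choice, and the `keep_ratio` parameter are illustrative assumptions.

```python
import numpy as np

def prune_neurons(w1, b1, w2, keep_ratio=0.5):
    """Illustrative structured pruning: drop entire hidden neurons by
    weight magnitude, shrinking the matrices instead of applying a
    binary mask. Not the paper's algorithm; `keep_ratio` is assumed."""
    # Score each hidden neuron by the L1 norm of its incoming weights.
    scores = np.abs(w1).sum(axis=1)               # w1 shape: (hidden, in)
    n_keep = max(1, int(len(scores) * keep_ratio))
    keep = np.sort(np.argsort(scores)[-n_keep:])  # surviving neuron indices
    # Removing a neuron deletes its row in w1/b1 and its column in w2,
    # so the pruned network is genuinely smaller at inference time.
    return w1[keep], b1[keep], w2[:, keep]

rng = np.random.default_rng(0)
w1 = rng.normal(size=(8, 4))
b1 = rng.normal(size=8)
w2 = rng.normal(size=(3, 8))
pw1, pb1, pw2 = prune_neurons(w1, b1, w2, keep_ratio=0.5)
print(pw1.shape, pb1.shape, pw2.shape)  # (4, 4) (4,) (3, 4)
```

Unlike an unstructured binary mask, which leaves the matrix dimensions unchanged, the sliced matrices here directly reduce the multiply-accumulate count, which is what enables energy savings on standard hardware.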
APA:
Schnell, M., Gourmelon, N., Christlein, V., & Seuret, M. (2025). Energy Savings Playing the Lottery. In Proceedings of the 36th British Machine Vision Conference. Sheffield, GB.
MLA:
Schnell, Marco, et al. "Energy Savings Playing the Lottery." Proceedings of the 36th British Machine Vision Conference, Sheffield, 2025.