Fischer K, Brand F, Herglotz C, Kaup A (2022)
Publication Language: English
Publication Type: Conference Contribution
Publication year: 2022
Pages Range: 1-5
URI: https://arxiv.org/abs/2301.08533
DOI: 10.1109/ICIP46576.2022.9897987
Open Access Link: https://arxiv.org/abs/2301.08533
Today, visual data is often analyzed by a neural network without any human being involved, which calls for specialized codecs. For standard-compliant codec adaptations towards certain information sinks, HEVC and VVC provide the possibility of frequency-specific quantization with scaling lists. This is a well-known method for the human visual system, where scaling lists are derived from psycho-visual models. In this work, we employ scaling lists when performing VVC intra coding for neural networks as information sink. To this end, we propose a novel data-driven method to obtain optimal scaling lists for arbitrary neural networks. Experiments with Mask R-CNN as information sink reveal that coding the Cityscapes dataset with the proposed scaling lists results in peak bitrate savings of 8.9% over VVC with constant quantization. Thereby, our approach also outperforms scaling lists optimized for the human visual system. The generated scaling lists can be found at https://github.com/FAU-LMS/VCM_scaling_lists.
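To illustrate the mechanism the abstract refers to, the following is a minimal, simplified sketch of frequency-specific quantization with a scaling list. The QP-to-step-size relation and the neutral scaling value of 16 follow general HEVC/VVC conventions; the function names, the example lists, and the omission of rate-distortion-optimized quantization are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def quantize_with_scaling_list(coeffs, qp, scaling_list):
    """Illustrative frequency-specific quantization of a transform block.

    coeffs       : 2D array of transform coefficients (e.g. 8x8)
    qp           : quantization parameter
    scaling_list : 2D array of per-frequency scaling factors,
                   where 16 corresponds to neutral scaling (HEVC/VVC convention)
    """
    # Quantization step size roughly doubles every 6 QP steps, as in HEVC/VVC.
    qstep = 2.0 ** ((qp - 4) / 6.0)
    # Larger scaling-list entries quantize that frequency more coarsely.
    effective_step = qstep * (scaling_list / 16.0)
    return np.round(coeffs / effective_step)

def dequantize_with_scaling_list(levels, qp, scaling_list):
    """Inverse of the sketch above: reconstruct coefficients from levels."""
    qstep = 2.0 ** ((qp - 4) / 6.0)
    return levels * qstep * (scaling_list / 16.0)

# Example: a flat list (constant quantization) versus a hypothetical list
# that quantizes high-frequency coefficients more coarsely.
flat_list = np.full((8, 8), 16)
hf_coarse_list = 16 + 2 * np.add.outer(np.arange(8), np.arange(8))

coeffs = np.random.default_rng(0).normal(scale=50.0, size=(8, 8))
levels = quantize_with_scaling_list(coeffs, qp=32, scaling_list=hf_coarse_list)
recon = dequantize_with_scaling_list(levels, qp=32, scaling_list=hf_coarse_list)
```

In this framing, the paper's contribution corresponds to learning the entries of `scaling_list` from data such that the distortion relevant to the analysis network (e.g. Mask R-CNN), rather than the human visual system, is minimized at a given rate.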
APA:
Fischer, K., Brand, F., Herglotz, C., & Kaup, A. (2022). Learning Frequency-Specific Quantization Scaling in VVC for Standard-Compliant Task-Driven Image Coding. In Proceedings of the IEEE International Conference on Image Processing (ICIP) (pp. 1-5). Bordeaux, FR.
MLA:
Fischer, Kristian, et al. "Learning Frequency-Specific Quantization Scaling in VVC for Standard-Compliant Task-Driven Image Coding." Proceedings of the IEEE International Conference on Image Processing (ICIP), Bordeaux 2022. 1-5.