Data-Efficient Generation for Dataset Distillation
Li Z, Zhang W, Cechnicka S, Kainz B (2025)
Publication Type: Conference contribution
Publication year: 2025
Publisher: Springer Science and Business Media Deutschland GmbH
Book Volume: 15641 LNCS
Pages Range: 68-82
Conference Proceedings Title: Lecture Notes in Computer Science
Event location: Milan, ITA
ISBN: 9783031938054
DOI: 10.1007/978-3-031-93806-1_6
While deep learning techniques have proven successful in image-related tasks, the exponentially increasing costs of data storage and computation have become a significant challenge. Dataset distillation addresses these challenges by synthesizing only a few images per class that encapsulate all essential information. Most current methods rely on matching objectives, e.g., matching gradients, feature distributions, or training trajectories between synthetic and real data. The resulting synthetic images are often not human-readable, and the distilled datasets perform insufficiently on downstream learning tasks. Moreover, the distillation time can quickly get out of bounds when the number of synthetic images per class increases even slightly. To address this, we train a class-conditional latent diffusion model capable of generating realistic synthetic images with labels. Sampling is fast, producing several tens of images per second. We demonstrate that models can be effectively trained using only a small set of synthetic images and evaluated on a large real test set. Our approach achieved rank 1 in The First Dataset Distillation Challenge at ECCV 2024 on the CIFAR100 and TinyImageNet datasets.
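As a rough illustration (a sketch, not the authors' released code), the following Python snippet shows how a trained class-conditional latent diffusion model could be used to assemble a small labeled synthetic training set as the abstract describes. The model interface (denoise, decode), the latent shape, and the step count are all assumptions for illustration, not details from the paper.

    import torch

    @torch.no_grad()
    def sample_distilled_set(model, num_classes, images_per_class,
                             latent_shape=(4, 8, 8), steps=50, device="cuda"):
        """Sample a small labeled synthetic dataset from a trained
        class-conditional latent diffusion model (hypothetical interface)."""
        images, labels = [], []
        for c in range(num_classes):
            # One batch of class labels and initial Gaussian latents per class.
            y = torch.full((images_per_class,), c, dtype=torch.long, device=device)
            z = torch.randn(images_per_class, *latent_shape, device=device)
            # Reverse diffusion in latent space, conditioned on the class label.
            for t in reversed(range(steps)):
                z = model.denoise(z, t, y)  # hypothetical one-step reverse update
            images.append(model.decode(z))  # decode latents to pixel space
            labels.append(y)
        return torch.cat(images), torch.cat(labels)

Because each class is generated in a single batched reverse-diffusion pass, generation cost grows only linearly with the number of synthetic images, consistent with the per-second sampling throughput the abstract reports.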
APA:
Li, Z., Zhang, W., Cechnicka, S., & Kainz, B. (2025). Data-Efficient Generation for Dataset Distillation. In A. Del Bue, C. Canton, J. Pont-Tuset, & T. Tommasi (Eds.), Lecture Notes in Computer Science (pp. 68-82). Milan, ITA: Springer Science and Business Media Deutschland GmbH.
MLA:
Li, Zhe, et al. "Data-Efficient Generation for Dataset Distillation." Proceedings of the Workshops that were held in conjunction with the 18th European Conference on Computer Vision, ECCV 2024, Milan, ITA. Ed. Alessio Del Bue, Cristian Canton, Jordi Pont-Tuset, and Tatiana Tommasi. Springer Science and Business Media Deutschland GmbH, 2025. 68-82.