Jamali V, Tulino A, Llorca J, Erkip E (2020). Rényi Entropy Bounds on the Active Learning Cost-Performance Tradeoff.
Publication Type: Conference contribution
Publication year: 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
Book Volume: 2020-June
Pages Range: 2807-2812
Conference Proceedings Title: IEEE International Symposium on Information Theory - Proceedings
Event location: Los Angeles, CA
ISBN: 9781728164328
DOI: 10.1109/ISIT44484.2020.9174162
Semi-supervised classification, one of the most prominent fields in machine learning, studies how to combine the statistical knowledge of the often abundant unlabeled data with the often limited labeled data in order to maximize overall classification accuracy. In this context, the process of actively choosing which data to label is referred to as active learning. In this paper, we initiate the non-asymptotic analysis of the optimal policy for semi-supervised classification with actively obtained labeled data. Considering a general Bayesian classification model, we provide the first characterization of the jointly optimal active learning and semi-supervised classification policy, in terms of the cost-performance tradeoff between the label query budget (the number of data items to be labeled) and the overall classification accuracy. Leveraging recent results on Rényi entropy, we derive tight information-theoretic bounds on this active learning cost-performance tradeoff.
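For readers unfamiliar with the quantity at the core of the bounds, the following is a minimal, illustrative sketch of the Rényi entropy of order α for a discrete distribution (e.g. a Bayesian posterior over class labels); the function name and example distributions are assumptions for illustration, not drawn from the paper.

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy (in bits) of order alpha for a discrete distribution p.

    As alpha -> 1 this recovers the Shannon entropy; this sketch handles
    that limit explicitly. Setup is illustrative, not the paper's notation.
    """
    if alpha <= 0:
        raise ValueError("alpha must be positive")
    if abs(alpha - 1.0) < 1e-9:
        # Shannon limit: H(p) = -sum p_i log2 p_i
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)
    # General case: H_alpha(p) = log2(sum p_i^alpha) / (1 - alpha)
    s = sum(pi ** alpha for pi in p if pi > 0)
    return math.log2(s) / (1.0 - alpha)

# A peaked posterior over class labels (a confident classifier) has low
# Rényi entropy; a flat posterior has high entropy for every order alpha.
peaked = [0.9, 0.05, 0.05]
flat = [1 / 3, 1 / 3, 1 / 3]
```

For the uniform binary distribution, every order α gives 1 bit, while the gap between the peaked and flat posteriors above illustrates why entropy-style quantities can drive label-query decisions in active learning.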
APA:
Jamali, V., Tulino, A., Llorca, J., & Erkip, E. (2020). Rényi Entropy Bounds on the Active Learning Cost-Performance Tradeoff. In IEEE International Symposium on Information Theory - Proceedings (pp. 2807-2812). Los Angeles, CA, US: Institute of Electrical and Electronics Engineers Inc.
MLA:
Jamali, Vahid, et al. "Rényi Entropy Bounds on the Active Learning Cost-Performance Tradeoff." Proceedings of the 2020 IEEE International Symposium on Information Theory, ISIT 2020, Los Angeles, CA, Institute of Electrical and Electronics Engineers Inc., 2020, pp. 2807-2812.