Is Multitask Learning Always Better?
Mattick A, Mayr M, Maier A, Christlein V (2022)
Publication Type: Conference contribution
Publication year: 2022
Publisher: Springer Science and Business Media Deutschland GmbH
Book Volume: 13237 LNCS
Pages Range: 674-687
Conference Proceedings Title: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
ISBN: 9783031065545
DOI: 10.1007/978-3-031-06555-2_45
Multitask learning has been a common technique for improving representations learned by artificial neural networks for decades. However, its actual effects and trade-offs are not well explored, especially in the context of document analysis. We demonstrate a simple, realistic scenario on real-world datasets in which multitask learning produces noticeably worse results than single-task training. We hypothesize that slight shifts in the data manifold and task semantics are sufficient to cause adversarial competition between tasks inside the network, and we demonstrate this experimentally in two different multitask learning formulations.
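The abstract refers to tasks competing inside a shared network. A minimal sketch of the standard hard-parameter-sharing setup can make this concrete: one shared trunk feeds two task heads, and both task losses send gradients into the same shared weights, which is where conflicting tasks can interfere. The toy data, dimensions, and variable names below are illustrative only and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two linear regression tasks on the same inputs (illustrative only).
X = rng.normal(size=(64, 8))
y1 = X @ rng.normal(size=8)   # task 1 target
y2 = X @ rng.normal(size=8)   # task 2 target

# Hard parameter sharing: one shared linear trunk, one head per task.
W_shared = rng.normal(size=(8, 4)) * 0.1
w1 = rng.normal(size=4) * 0.1
w2 = rng.normal(size=4) * 0.1

def losses():
    H = X @ W_shared
    e1 = H @ w1 - y1
    e2 = H @ w2 - y2
    return H, e1, e2, (e1**2).mean() + (e2**2).mean()

_, _, _, initial_loss = losses()

lr = 0.01
for _ in range(200):
    H, e1, e2, _ = losses()
    # Joint loss L = MSE1 + MSE2. Both tasks' gradients flow into
    # W_shared; with conflicting tasks these updates can pull it
    # in opposing directions (the "adversarial competition" above).
    g_h = np.outer(e1, w1) + np.outer(e2, w2)
    W_shared -= lr * (X.T @ g_h) / len(X)
    w1 -= lr * (H.T @ e1) / len(X)
    w2 -= lr * (H.T @ e2) / len(X)

_, _, _, final_loss = losses()
print(initial_loss, "->", final_loss)
```

Here the two tasks are compatible, so the summed loss decreases; the paper's point is that once the tasks' data manifolds or semantics drift apart, the shared trunk can no longer serve both, and the multitask model underperforms its single-task counterparts.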
APA:
Mattick, A., Mayr, M., Maier, A., & Christlein, V. (2022). Is multitask learning always better? In S. Uchida, E. Barney, & V. Eglin (Eds.), Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (pp. 674-687). Springer Science and Business Media Deutschland GmbH.
MLA:
Mattick, Alexander, et al. "Is Multitask Learning Always Better?" Proceedings of the 15th IAPR International Workshop on Document Analysis Systems, DAS 2022, La Rochelle, edited by Seiichi Uchida, Elisa Barney, and Véronique Eglin, Springer Science and Business Media Deutschland GmbH, 2022, pp. 674-687.