Abstract
Deep neural network-based pretraining methods have achieved impressive results in many natural language processing tasks, including text classification. However, their applicability to large-scale text classification with numerous categories (e.g., several thousand) has yet to be well studied, as the training data in such settings are insufficient and skewed across categories. In addition, existing pretraining methods usually incur excessive computation and memory overhead. In this paper, we develop a novel multi-pretraining framework for large-scale text classification that combines self-supervised pretraining with weakly supervised pretraining. For the self-supervised component, we introduce a new out-of-context word detection task on unlabeled data, which captures the topic consistency of the words used in a sentence and proves useful for text classification. For the weakly supervised component, we propose pretraining on labels for text classification that are obtained automatically from an existing approach. Experimental results clearly show that both pretraining approaches are effective for large-scale text classification. The proposed scheme achieves improvements of up to 3.8% in macro-averaged F1-score over strong pretraining methods, while being computationally efficient.
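The abstract describes, but does not fully specify, how out-of-context word detection examples are constructed. Below is a minimal sketch, assuming the task is posed as token-level binary classification and that out-of-context words are drawn from a topically different document; the function name `make_ood_example` and the 15% replacement rate are illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch of building a training pair for out-of-context word
# detection: corrupt a sentence with words from another document, and label
# each position 1 if it was replaced, else 0. Not the paper's actual code.
import random

def make_ood_example(sentence_tokens, other_doc_tokens, replace_prob=0.15):
    """Return (corrupted_tokens, labels) for token-level binary pretraining."""
    corrupted, labels = [], []
    for tok in sentence_tokens:
        if random.random() < replace_prob:
            # Out-of-context word sampled from a topically different document.
            corrupted.append(random.choice(other_doc_tokens))
            labels.append(1)
        else:
            # Original in-context word is kept unchanged.
            corrupted.append(tok)
            labels.append(0)
    return corrupted, labels

# Usage: a sports sentence corrupted with words from a finance article.
sent = "the striker scored a late goal in the derby".split()
other = "quarterly revenue beat analyst estimates on strong margins".split()
tokens, labels = make_ood_example(sent, other)
```

A model pretrained to recover these labels must judge whether each word fits the topic of its sentence, which is the topic-consistency signal the abstract credits for the downstream classification gains.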
Original language | English
---|---
Title of host publication | Findings of the Association for Computational Linguistics Findings of ACL
Subtitle of host publication | EMNLP 2020
Publisher | Association for Computational Linguistics (ACL)
Pages | 2041-2050
Number of pages | 10
ISBN (Electronic) | 9781952148903
State | Published - 2020
Event | Findings of the Association for Computational Linguistics, ACL 2020: EMNLP 2020 - Virtual, Online
Duration | 16 Nov 2020 → 20 Nov 2020
Publication series
Name | Findings of the Association for Computational Linguistics Findings of ACL: EMNLP 2020
---|---
Conference
Conference | Findings of the Association for Computational Linguistics, ACL 2020: EMNLP 2020
---|---
City | Virtual, Online
Period | 16/11/20 → 20/11/20
Bibliographical note
Publisher Copyright: © 2020 Association for Computational Linguistics