Other publications by persons with the same name

Answering Complex Open-Domain Questions with Multi-Hop Dense Retrieval., , , , , , , , , and 1 other author(s). CoRR, (2020)
Supervised Contrastive Learning for Pre-trained Language Model Fine-tuning., , , and . ICLR, OpenReview.net, (2021)
Multi-Objective Optimization for Overlapping Community Detection., , and . ADMA (2), volume 8347 of Lecture Notes in Computer Science, pp. 489-500. Springer, (2013)
Speech-to-Speech Translation for a Real-world Unwritten Language., , , , , , , , , and 6 other author(s). ACL (Findings), pp. 4969-4983. Association for Computational Linguistics, (2023)
Improving In-Context Few-Shot Learning via Self-Supervised Training., , , , , , and . NAACL-HLT, pp. 3558-3573. Association for Computational Linguistics, (2022)
Self-training Improves Pre-training for Natural Language Understanding., , , , , , , and . NAACL-HLT, pp. 5408-5418. Association for Computational Linguistics, (2021)
SpeechMatrix: A Large-Scale Mined Corpus of Multilingual Speech-to-Speech Translations., , , , , , , , , and . ACL (1), pp. 16251-16269. Association for Computational Linguistics, (2023)
Box office prediction based on microblog., , and . Expert Syst. Appl., 41 (4): 1680-1689 (2014)
Self-training Improves Pre-training for Natural Language Understanding., , , , , , , and . CoRR, (2020)
Supervised Contrastive Learning for Pre-trained Language Model Fine-tuning., , , and . arXiv:2011.01403, (2020)