Pre-Trained Language Models and Their Applications

Haifeng Wang, Jiwei Li, Hua Wu, Eduard Hovy, Yu Sun

Engineering, 2023, Vol. 25, Issue 6: 51-65. DOI: 10.1016/j.eng.2022.04.024
Review

Abstract

Pre-trained language models have achieved striking success in natural language processing (NLP), leading to a paradigm shift from supervised learning to pre-training followed by fine-tuning. This article presents a comprehensive review of representative work and recent progress in the NLP field and introduces a taxonomy of pre-trained models. We first give a brief introduction to pre-trained models, followed by their characteristic methods and frameworks. We then introduce and analyze the impact and challenges of pre-trained models and their downstream applications. Finally, we briefly conclude and address future research directions in this field.
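To make the pre-training-then-fine-tuning paradigm concrete, the sketch below loads an encoder pre-trained on unlabeled text and adapts it to a small labeled downstream task. It is an illustrative example using the Hugging Face Transformers library, not a method from this article; the checkpoint name, task, and hyperparameters are assumptions.

# Minimal sketch of pre-training followed by fine-tuning (illustrative only;
# the checkpoint, task, and hyperparameters are assumptions, not from the article).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Start from an encoder pre-trained with a language-modeling objective.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# A toy labeled dataset for the downstream task (binary sentiment).
texts = ["a delightful read", "a tedious and confusing plot"]
labels = torch.tensor([1, 0])

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for _ in range(3):  # a few gradient steps of task-specific fine-tuning
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    loss = model(**batch, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

The pre-trained parameters supply general linguistic knowledge, so the downstream task needs only a light adaptation step with limited labeled data, which is the essence of the paradigm shift described above.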

Keywords

Pre-trained models / Natural language processing

Cite this article

Haifeng Wang, Jiwei Li, Hua Wu, Eduard Hovy, Yu Sun. Pre-Trained Language Models and Their Applications. Engineering, 2023, 25(6): 51-65. DOI: 10.1016/j.eng.2022.04.024
