Abstract
Text generation is an essential research area in artificial intelligence (AI) and provides key technical support for the rapid development of AI-generated content (AIGC). It builds on techniques that enable models to learn language rules from training data and automatically generate text that meets grammatical and semantic requirements. In this paper, we organize and systematically summarize the main research progress in text generation and review recent text generation papers, with a focus on presenting a detailed understanding of the technical models. In addition, several typical text generation application systems are presented. Finally, we discuss some challenges and future directions in this field. We conclude that improving the quality, quantity, interactivity, and adaptability of generated text can fundamentally advance the development of the field.